[Unrecoverable binary content: this span is a raw tar archive whose payload is gzip-compressed binary data, not text. Only the archive member paths from the tar headers are recoverable:

  var/home/core/zuul-output/
  var/home/core/zuul-output/logs/
  var/home/core/zuul-output/logs/kubelet.log.gz

To read the log, extract the archive and decompress the member, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.]
r]]GY׋\ž)a$q?RTe>_]W)9L0+^`r-# dRKJ1<q#Ȑ% tm-K響OjZz-'r9ރ\soÛ뾙 Z0]{~鍮9penc+ EoંB*g0>m( \9g)yY_<jN(gCy=77pc|;8 % o΍.mtR R)J{f]K?~/ A'ûԈn#WOE-N~TA/*UECE90Ub)|{6.j0u]LBIB?"lUpo6..|7poyqTv(֗_|\Q~yW8`¨b*?xOEEWKe|ggJ Th|_x:5!\Ry闋#R|.KRy1W#zwi iQ\S%P17|77~ݣ3|_@RߥUf8u``P H,ٿxނt_`T:uuKE~xOߧWT겑g⋢e!],}wǑD-<1dy AqenCj~86J*]:NtNJ".Sh ~n #6ҝ.H-|TsVݍ_jϪ R8DA_)19}*A+.?>in. XS,]VLG=i>1|0)`rcEuUo9i>sa֙/8i0wtUٰ' ɏ`ʄ2cD\٥kOux*Ԯ ZyPzR}\='Sk=ЦL=IDvY]Dm on߂v͚w4:~;v{^q[UI] ^Kp|iW!cOwR5x;\C-7hW>i]O܀>*MuQi?n!/04ԜK,WJy%& x70 vƥ>*k`) p[uLm ؞aU8bp7tngs\X9.'$F chr`_t} %yz쯖#ʬm4͐XFd&#QHYu2LwZ D佖豉hj4BZ"ZvqO~7=+r߇'&H k.(7|vYR ˽)׆ucU8o08;26=V_X5R՟b45 wSy{}h-R܀+)&[)Q[M*Ffd j0DVt8]h59m8KٜkX7MIcÎGzJ >Gl*@ (LxdN0+D{-"#a:0- 9ڰK l}AY1 3 r_ J{iL`%XֱcZadPg($^2"YobA% 9Ij$fXб5YӲeM;@)t_+uUu[Ϻg^7ע9R-粃/׷$ =0,YϯZ8(RdKJO+Lc}jӮjӮkjSbT *DlB l e8ºd)Pʍ"65БaYqGkZ 'RHDc[#g;~ZBϨ.2%O6~xWmSbF{.`>G>h}0)vK -EǸ./q1tYbҥV%VЋ*~PNq;t k&&VQR1qV{tiZ+I%)yP^@ӥ>K氆t\\,76_ayޑ^ʛQC=W {0$_HVAeii%ZVH_rdVgUv({E}{O `vKx*mm<*#.O(Y~!ȠXRQ:Mi2׶8!KI ˿7ā)xg՟g#•藻Sq ~7~+qp@u?KDwðZw8~~/jډߏʹbN(e- zgG\ g_^Q}PA|l[ͣPk'y:-Mܛ(SlCcfuQ/U~]ZO^0an1aT xAOE].NQ$\+0Y:U4ypR/g"?RhY2ngi2f!G<)6bbh!dOPA0N()ya`"ZA۠FH"I`դ2`53-' F>$L'/n\ǰ0z 떁ߞ'4O!ЅuP+_W&$%˳wm+WOrӁ&뛍>sE UiNAQ&.)z':H!Ϥiɳ润^/O fԁQ&t <i{ czfP4ĩAXT\T"D)59bH%R aEFӖ5rvԳJ5upIPStiCq"SmgSiWa y%GD*Ц |zFR=3RSaXg(qP5ƽN뤩&5ZJОGiG`(7j f$Jd`sAGIHdъfciڪ,Ndb!Jܺ7$7no/kFU@AnLe.hBT;*XG\q;=koFE/w,g`p.np{AbF?mh%d{2 ߯%Yփh=c*]UPF/.Eȩ'3+\*_(VD"R @7;Q>FE x:;݌~P6 M8vdSCeXv -ErudfʵZp-)ҽ[R@"?7$ Mε$ :2S35K<6yOv&0AcJz]4^3⎘R#ZKcjnď]5+%fV5^]vnigh ,Н2dPjc{6)hGlKuzz"؂kܥ,wkؽf1`ȿ~1v>jƈ=5C8~!Tw3b3j<ȲL38sʑVäz;HW{(tD:* f1c)t|;݂g"ٍDQK&= >e8hG?l#;Zv ADA*<]8aQ« ܘ asIa8 Di(єM㏐]"aw?%ʾ0> Xe°{3`SLLa)B1y\P^'pCOn& yD T0pa#tX9)1t: r}\K k}?PV AHRVa 2LwZ D佖豉hj4"6"sNng}S$[j{/#eo٢fn7߾GxϠL6,aњKW{tޕ0uHDJv wA'_=?.v~;jpw0~RIQ(m7"뛏 {!oyXR!e?ᦛwn"4ҡ%#yzsM@ѕh֯_ڴni=/9wSL򻑟 Vhar!CvECl=PRvndzLS6dH"<:FQ0]bdg)ٛ!=%$\Ĕlb7mZ{`K8V`OL`,Ҡ8Hԃ_VRdA41",DDꥦ6 FрG!eLD:l4,:mK's8CeȽwݗ>H#j5#,*0wX)!SʽY#Z*"Ai\:pN:ƒfQM&?H9/{"*/*gPh7v/%* R% o>q <$dK /+zW%QO~Ob?G >H!d8n4&Z 
}V؊^I`[r]UzQb&SF680n:eZc2ʕe 9ݿ060µ]qִ!"˟%u#wK۰v8y*W~_jFWGe|Gx~aTZ6`FptH~B yݹ?q2DlcxNQLJLAlv_Gi-S>QX[@T}-O}؏hTK&؅՟kI{!ͲKWT{JWT{Y](5蔖 SțUR-Bˈ&D"?\8̕d-mqT/#RU\Bow3Gw7a~oӥ[ .y5] T= /‚&HJڗ"aԥ ˍCxfAqUZJvWijsn=c$w.Q(y$[o@nxTA9 )j>t%Qq\yxgqfD7W~&ř=KF.ʙ1ܙUc-!둹%&1W ]b]iEH + JksbsU’\As]ײ+@XqsЕ/ _JX^`^\aLU9sM_gѿ¹oW.> Ck-mJ,Ԏ(K /עBS$4ρoT0/\ I! EOVQ8PAA((PLè\zMR/xMjE4D.{( ͙+ - Jql{xEo (8i` 41KPtqBVxdɍUm&e*\'$m%K;O V & H  vCtO ĸQBNn;Z!ENקnVܶxy&S~0`! #[XcI"YƐ;4GݹS(}7;5ࡳ F )$WҥA&|U\, Bzr 8M(rneZG!P1rBxJ!RHD#|5:Ͽn[9|HN;=rmU}mvb{n;tn'8m=pm}N >IW) -c1kE\gx \P[]D‰uX Hg)y`79[PE4J}hm$}\(ņK!zƌf|s"baHVFVu<ꡁEci<:xK+oh2K^ބiy`#9ʯ|(l\>vW}xtOǎ;"½ҺbO3(}~Sx֯ikzz(+hRʑnI2kPLF8!f 7i''Q#'ᒿvKPR ]<ЗTӧLCEQҥwӛuziG1g 1%X{Ûۆ:0mL۩IFoo|e:D,S hv>~X>+7Y:u9_jZKFHLQ񟲉ӘCFjAU`(3*h2:ڀmT 5!Cwژ zbfk>@@Ö,I›]mQ̨af03j5̌fF 3fF 3QCҕ3H(֧N a{,q_Z&[%,Щ->HrE׈u|z'$)6IYlɞ ˃*( ot(O]Z)˛(9 LRx0G3] !vN!A 2(5“'V:d ႉh hmFH"I`ciMfX xL3:dC/s|>ǰwgBtҐ͖×7va=#<"${KA_ P~cH@GAUPPTITAh-p19䙔6:-4f>vק} L e(!Dr E&R=`P4T a%"MTR#*#+X 6Pd4ͬgig^ Ԭ!A/KMT; q\D^L>u䎥~hcdBrɑ'iȩ$n`,؀쁟EZF9RbIjDZCSk#\vZn2ICR^g|Ux@5+wC}|=orȱ/]е'n zv{W: mA3%!sqkR0ttE-SxOGC]> 9]x 5KY6{O ony=!z8Mc$oŻˆ ..xJ4ׯom>%[,>eH!B6!?SN%f/~l@[|X K,T>e9Vo'1CY[DH(򅖜N)bxiP퐧/,/EGdx>[cR;ĬW逰 9HysQ 9'>c%g`KPeF7GvCHuJ`Tj0H><Ϸ嘳 DJU0xa"82Nf̻T+-j} 37סjų h/51og_K Z84V:Yɿ$=EÅY2Nbu,o& R-hP9ӗ^Ȼ,f_ vpM*KFTAWPAЖm: q,B,ʒtG Œ'wvFw(ۣn ϧ|Ͽڕw`1l/sB@bg.>{G_5bVJ][^D3FH%Jfc}TwGspvpwTEyKmMI6V6ZeRqA$-Is-= |A.lk/E9w}CE}z$arCiY_\8^ eRXG!)\D AC2%SZd.%¢-diaEicL9)e6L)E6jԵȄTTb-loa_ s3 H6\&TF )D Yg͍6F4g (B<В888mvXSxmVLQk8<2U@$9 o$>}#FBk$FBk$9B#FBk$ɄHhTӜ    5Z#5Z#5Z#5Z#YQHhxD*;n343.Kn$c7LHP,Xa5 YD M*:KcE@eoܫ436Yy[5)~7i{>LbҠ Bw=h^BWDH:gtZUv5DB_F^ʗϊoeRTJ}m)j@8KT)T{2`ɖ"DN+'r+'r+'hnYevh\9Gd\9\9\9\9|s +' ,ZMT.e @~ @~ @~_0eC2"t`k^P/Z')Sv. 
i_f@5 ^ iYd&##Xd&N#N#Hfd&Hfd-$vl1Nﳻqa6:gI+?Vb+~NoNOfMޜ'?7#O>oG1;'9z3ۑ3_~ڊZn_ oGݳ||SD귌B({>=r'o_:]0\PjTsFqE"{C旷iT|$/U= uD]ˆAݎQ7J:=y jv47L//Y }?x_88<,Hq('er~u]- yť:kB|+38 &X>Wn|bl5/$T>itb1.r/ ƒ',JY!lFHR(ωB- p]hk0+e:2)#Oɂc\+5$JAl{هk-I>;TKO?%dݽ8L݌ 4-*U]Xz]_Q5Gm=<6>ܓ2M> ާ>x|0ٗ3hQYqou]Tuml] 22 LI'j~2ߤ]E4/8.) j kbNB)@i+*HnCʾdK--< %Vfnӱ&Sٰ$]MƟ> `,jWW_z yc3t#\)v?JhE=Q&hAG3B .bd4JjgB,(!T US+t_w:;εs,w% eFaK *j鑩Jd17#g=?jg>eyhÛ~y{,v=4jTUpf]Ȍ45.lх,, +|yT5ӇQ!}<*OGDZL=mO-9MxE]S"!Ub =;fZ e:ς@!VP I45 &j$dmمlc aDL@2k8l٠lЃqg6Ap|䥷 sti}烞|]WHez:~Y-\&WK Uh@)uTtV"V"YpNJVVd#K`L(Eیsq0+ "#,(6`Jr-tLsWRA#-gK#$jbq#gnV5LNlcR(,JƿY2dc>kFΆ|՗fFڨ9xiPd2K/uJ3%bI  d9bz9=qgaUd.%EB'cK&]TӘ|X8((hbnɔW*²I [y) Ū_>60 y*waCЛiU_,/MI7ۡ,KD?$P~M,cƳ?Z~h:Ba3ȝ((M(MPdk¼Aa#h@%]X;2\ uEd:8TX75kwRgr %6Gv|筭Y5޸vEo\It\|5Vw]{}o?"/j-[;lݏfWwy9]Cw-;l9ؼUUϷw>ޠ;-Wi՞7!cϡ5I;.tg+֜tSnMws&L?LQ^'zύ|=R3;4ǍrE5!:7$:titܕnv I"Mik}<"WDxP^ȃFCRFGK< VNɤNO  @/S)&ʴtYm46p63=(1(["ױ()TRq g)YK2Uh41d81` \MJGWpF+m+:+pե~Kv/v\u? 3KxJGe/n v'A \uiֳ.K+k2W]`UVm.HBnu^U OL7foL҇Eto_j%yj#[un6wo}J?؀ 4 ~1xJm 7Rrɰ??:XՂfYXf fqޝp}y~̿`~]H) UI ; rn6GKA: *Bh2ۊmEǶc[ѱVtl+:ۊmEǶc[ѱV7nAmEǶc[ѱVtl+:ۊ8ۊmEǤ%r?']Nj34Ȩبyϫ\U*)dƌp#z{2{EdTYeck%Dk+D&ȓ:,kJ^'rNަ * r6Akrp. Ћ>:甯R-鉇Z`iMU :L2s={4CF 5`Z*֪ 46EjѰ6Ĥ`Ue2LYT@笛K$,ާj(3 &Ά~6??7ɫ*> 0条ܨLr66K)Vǎnl]`HxE? pۿ1٨IoBI#|̊uLNeʤJ#j1Onv_\HMFS斝?tc\-bXk][#p []6Mvْ΀7 %rlRvc8iנb!UӜs+:_b 0zp+"mŏwt[o0η(uلl\}` `ErhT檵%`!ê621{,m,!{|=x|eoC9A=ʔHEAXlMfZA`dgZuqw^rޗRi00xm}\/S1gV-%{)rc܎i+gso@kC; -C C \ᛌCqYS2?9.~QF1=QiAgAˤ, Ć6kN'y~Rijn (]N.ʂ+ߏw['y剽\s+\ϓ.{%JfZ٥L*g3k-_9xNj*TG GHLeHՒ/}S>kJI^Y-|qYUݏWpv6B1ULܬEy &jlHeJYkU qjMzb0LVΗ^eGp,8Wpt&G:qsW}wE_O~?Vֳ7Y颀s1FEk:w}4loo/J ~!<9;YoM{oe+F^| e05 F8.{❹2:K%8&HhBj jnv,f3uS(hP|fE::>+ pB+r=Բp8us3|b|ɷL /.U$oD 5j+Mr4F;i͔TI`L;wKT>h,>>Y}?oD3k`nn}WG}:vRx2G"] qmElr[{mɩHX*.RbJ+PE 2,Tt.Fѵ(ۘ?y׷;)[9 f}?Pl˵QAP3 3-]ԱQRdsP^+ HCJ;Qt7J Պ ػr>9::?ZgK9K9#mUxW*0-$6-&BΌilZ{kl.$ͭFg\ -sa{X1wZeIeUmBD*Ÿwp qXm`M:HGRj* 7[D`/rDë5Ch㞽˄$u!8xX! 
-Z臭 =A(9^ep/!P2輤dCg_pas\+^SMz]bs 8G$GU0LjZP)jRY=@۴wZsPA1DQjp9J##k]k 72uNspC ;rŸWfи{!X*Eejj+q ڈW-6%dE 3qb+"Za  TT!ё`*KEt)I5ViܻS0ΠVv7 H9b SV4XT`t[9 \ ~*7\?g dxP30QmŁq|`<q, jdip)e :j-:kThs]VTR[}.*6aNy19FPf=a)-%lCE Ȓqi%e#ׂ2Bآ5IX\AO-*.7W syz4,:efHfu  g Eߴ* ), x$AT[N`9rVL @ћ hW0Nw|R"6!SAp`\ !,k@ C5DB PeEЬ6o˚J"鄼5%l8"K_bM J/ػF$+.*i`vƬ2?ZtIZݽE"))%v7Y$2"NdcG@ 1W009qLDcBNJK5%ǼPJ %[ HEyF-ly\Frᓐr1bhcbjIc-05_ {Ԟ(m 19-\_~w %jYU)$8&mȽx-=j\'knGVyicˌey!ƛR"@hS9Fa XKT~ck m:Xb0²tf@h =JPz`JnC1qlS,TX-Mn]tY':8boE\jl6lȎhR2.BEC](x餀[`4b(eUU"pw"UF5`j?L/z~2o!X8w$YW(\%hk/+OyMi,U !/W]Ԥek};yZڨQ-z_,J:VVKQc9-Nq&t"~6p~89'w,Цsog 1neF7,Ay,YNIz@zVWݎ݌b}tD۫4q]&". wq}H{gxr;af۷8Z󀾛o|eqz30MƑ<5!ka]*ßz}];F,#!9I)A{OA߆wapX?mεpoޭ?Ɵ{gF9|F~y:~o-IIy<|PY⍝na|\z٫DɩtQOJa7m}zyl>9ESr)mA_z5ͫWBRIIq&-r+۔o0 ^لea2l\qx;_zw祋mϙ _٤0V~8pcW{ Üwl1M̬GZv_S*m$lD tBk9r?\N_UZo91\fSғۍoHO=3sx^bI>=:!@6BặÇUܝFQbOv{ƃ9ܳ[d:`p. 4vC^̶i4|Dybuwl ޟڱxk%EĜvY7+_yu>ښ1i`љڷyTDY&oxiZŽenb2d/֬Ĺm͊EӇ\-V]ʝ_[OYyr"-ݳl<~ Bm>ݫ1'] lP+6/k6[Mf?ٹQǽ3^'Xãu0[[",jZRE C2F>Xf%/ 44Wp=]][UvI+,C{{Ǒ P9                       w+Ai lfs>ƸZO1n2j2j)Eˇ/TnÆJjY^EeqD|L/2 )|۳:>n;{^]F }HO @LL%`Th%+#}lVw(A`X2`:y^Ҭ!Q±P Ս( 6v2ި=NjIO>up^Gzk2}&ˊr^LRB˓*mYԸƇhlidM[yw҉ay7]~.F:3C3lԘH]ůox֭!2&|,x~)P jf1_C^?nYv(s0u4Z\9lY~5F8{w'qw=wqP |<TKqɅ+eu ѥOq]qQl|lʝ' O?O~M>e(P*JŠƆLԭ&%Z5KN:r(,q( ,f~!w!?r> |"s_^OFHV>#uA(sQ6|ֵ7vZëPq2 mҴb_da:{<{K;oaf^9葱ƻЊ[jko?$HHīAdV';K~e BC>@CM"R#JV^z[,3:BC/@Czn6ί7.I-ONz@<{%KJhiM \f«SFpI(bOm#"s/Q[k4oc/EݽحPLZ2Ĵ\>l;gWkܲ޳x\J;(vQާA'hv{ƃ9ˉ9!fUOc؇>\{gxr;|+Wv<ؐ?†w?n8ngɰ0~mt3eE^ 13_{zJo,-\3dxzǁz`iφ10Jk1ӆsf+̹8p\UVSWJ)\peԜQY|U6|5?pdp|6WO2ai ~p4k;NzpG߈lm&\ܦ>pm̩l%pQ)?L2߯_?3;I1ۿ7v[]jՍ#W_b؈P7(njfnz'qގl'7xj08=:.ԎRO*\?pf0f~C}Bjs*lVD'KVQ6L(Ut% c 7鏊y}J`n"<:FQ8޹#] 7c7HpGXT뒀d5[E`K8V`OL`,Ҡ8H >"]I*ԣ^.g'R["ٮJ_? 
*`P2I?{V3¢ ,\䔀Cq@kDKE ( K"IgB83`' 8 hЭyօlU&؞2!f|ĵ<яӿ;S5iz&ѥO8:B \!NP< JR2z9E( n&nt}=k+&7\ &yOyT'w2WfܛO)kJ^U_?n¿^WÖ^Xp*yʢVro^Vנ0c<,[vcNfn7V= vق2n p;ϧ<5Ki#֧sw䔖,6)* # [-NcsJr(C-OmvrDI(.&jSk19|T%-FeDtfDk+s3)mtZ̹qnZͤ_g>; L Ft 16:Jx'E<I0rIȈ&*HΑC N 6Pd4:%|Wf%.7*<~ ^jaPk`"`<1* E`?8F))ד4dGDAA~fd3!-ăXg(qϱͪVg wKQQNF(f$Jd`sAGIH䃣":ӏ8Z56b:pfQqI M( a#V눋3.Zx3un&~|.`|Ȯ';f:Z+fR @7;Q>FE x6 {,ƷQȮܡ:7$Xەr{w&!E9| !"w뀄< T&u Z~ Un׎Ph *| ]J"HV$ߣ} h~<8Z'CADB:(rF'vZlCDQoe0 6S <+3E cWrE6sF+-MlI+8˜ZzwT͇'Z=}ݪ5Aj[] ]0|tӛ4i|S\אp"_J,ìpT$ZGml8k^AZ/ kU)4Y-:MC#矣KSLCo{E}po4/']YI/돯/>{Ƿ/~xxw boY|g{߽V]W*+gA潲%7!p컅l-z?vnuLZJyoH3f{ l~=VП\5Z0Xz?1xKχSK$k_ά`&t* 4h#U^}ccD zkZ}y5C_4]}%?T`#0bGJQ&6:'fzh4@?1l-Ne0EaF, H ,R\ @h ٨{jH-0Mhu2+*B22n`QkG)\p`dҫ u@ΗP\BUrg4+ruUxWVoE(eEo|/. a /24wF8sc PH8lHD'"VKL v)=܁L붊6X B*Caj3j(jye4zl5ͭHY눝_Clr[;a?"a?I0ֻ~ֻ[L!ڐ{ u(L *1eZ z4s6:-AT[MR6Ofgul ÃVV.`t4<=s82b,'kW~M}0r=̼Z}#=LF O7{knCal\uބyY_vӸ78*Y/!rS~O¸:jels>7C@qߕ'iL)1X1 ^{ɱ$Q -9+R 3<[k,чj`负|ƻ9r ={P0v,n Z{1`|b1VGmQTs/ w竰 |yKY˩0`\_ 8~\sri{pep`_li\|E/n: kQ 1:FfT*&JzyeV#RoCXzuI}XFkI@Y7&U(*S8ㅉ. 
fȫy޳gTIl OރjOW1y X YB\hKddFxƹ ͥ |5)I7FU[MSߤ,;NB!d)Jb"vv\W4waC&o_Rt:!r;荩]90ӉS}6۟Xk >e>@a0_ z0>@߿ڕN_K6{l-+Pa!!-UT9Һ_ .;A ~P@&$ clBY= 8RҨ}V!]>n7agv>o*!Wi%#nb0/i,K3L,1ED}LgO:@ï6UE1KěΪ]etN6&<ߖ=[:j6ԋuM}&&NTb,H__ FszR0Ib k7Iqx}5jm#jVh_ūRu8M*OQE4h#s(lUZUIy JINX,.qTЅKFKKeFkόc9Jdax@JhMIa0+c*i!BJQ:"JI2p!k!Yk1 FFeGkӹAZGd[X@oQU}[><8Χ0qt^艬':m)R5)l鈙7A?Ca{kP W?*̫%2 Lv𳃟q| F,0&ךaHHl~مۏ|nTTW_[qhR^m];I*n??9edARZ^'$\H4JjIgB,(!T}UF;9}mr4.Y(lNB-=1UF[֦s?߉_ ܠA.=ZuQwf,YzC4 RP)x,{_+|^~!yz?ּJ6:I@5.\ 3jY䞟L4x.nit +R2;+eBԂ~!Nv1zGz+dZVjx}N{/kڊum;io{`j֢m:z_g1$g~z;B>X]MkU:o!B|w uw }}ɽHcTjPcr/ .̃Mt2+tLgiŞsS 9G< 7.i§S ʿL.[;>t4Rf_._MTUV\&oZ@C*'(ӫ!WUAWeJ8LzRQ+ M(F֖(#w<"D<3W"ǸV9kd]J;*) 4'Ej>V~LW6rW0Zy~,1m[avZmhy@cf ȝ^v3L=sE6" |F*̰dgZ*åw1J ƒ˾ =t>:Spnzor7=fzv ou{\;ߦq] [;:i5RZ/IvFt:of!嬦ݮ[=/4tZziMh~n+97PN-\ZuQ,#gM\xxf\uK"Qknn~ ݜKкn6qJ̍*nLUqcS);RŮl0ࡴtUf;J~8^mm:69yREB*dV{)R2Q~l+;֎e|nֲ.?^Tc'F>'D{D z O3ҡQ!5_vyڌdMHMYWr9JxWD5uwM2}Fofd4+mJ=t}$/wkm:YWՄ:|P>xiP酒6$̢K\:c PR:`9Zz3EbQd v,& yD??tMo9,` ]!s1()"H<Í@/t822LyVkP NO9 J(U6)F"[:FJ8tjf'1h)G VQZo.C&tR)e%It03arBrcG 6NGtw#BsJϟ%ot,}M1R - >{MD%abI9~q11RSMkpT#X3׿;3cr)26w 'S&(m|Cى2LN /qr5#, 6%'C{%(]ǓјIjl,yWхR9>Ʀ:Ӎ]/N5xsV^ka_w_CS oM{|]"fi_-NO_UDP}R7jECkiw0_U㚄9tu6em@_V4mݛ4/f XߟipTUdv q=cگ/Ui1dRjDw~W?o`:eF+UaQBʊˊ>{vy\Y6A.WzWkfg1{r063~B3wQ?s}Yv+|+jdGieз@ڏs\GCv@'|σs2my~vaNo2k_&oZ`M qC 40b h^MP3ę8,Yֳ^ڤn/* FX-hOE垿Z#Z8be0tSgkjt<*;M!E.OG.H3Ysų.!WWG",kS|߱L4X1uګU&vR}yIqYAs6܉xA?pjua2$4~ fc[4_315rЪ-5z!|qبpcs Z{^gL ryq6ZᘰWQ)Mt\"ǡ_s:no\bgTZR֙>b}jgnCj * (oʊˎᄷF߁\vJi,Be"R zEda$ʥF`sdr L S7^-8ql^ߣtXuE|cg^P+"PG^D`X㹓KUsW526*+GS%m,>H_ocUKBƄlWW7u8Y~㲲-֑,-FԬc֗!W0vجՕ2KvG?1m%Wδ"Akd>be]yg {DxuxǷ3VcGt0V_xƴ(Ebh18o3 RHF%S69kEZ*do)[kg( >6/ X41AH6I E DHG͟BS(z@)Ey7rh}Vʝdj''պ7>9)4QWwf-1jȓ&6\dݱe$ʛ h< Zs#C6Ғ88{tRjT"5U+WdZsD J@#1!D*$P:c)U1H*TMR}#colFVqJ/Bz" E^DNorNd{}_ـoA=} Pthh:aCGb,z(h9)W'i pg6T^3mBc"^7r6#’sy2l_q*M j JB*yE:G0"ED.:ĺD ݨcbY/I+2XƌHE$AKY\p^CQ uXϜ>dcHZ"iņdYhQ}9h`EJ V:R]VTBY>peEHZWDT.j[9 .*NKI}Bfً^rC |Exmx)N!rɨSUb˜ȝu dV 2rʁ>29[g\ic-س':9/.|eנ.ZHq$|z36q/n9S9HpG/٣W׃q9kD(Z^9l%)a 
?E#B1qtl(#!9Y^Y[^=Iq-y5o. 2&l8?7."0k45]^ M G 6GkZ4u1mS -H^ވC>M4t7-yF 6;8G^3moHBCͶy2Q Ǔ@ĶZ \6?6e>򐏗⑿ b3~O"1}p#-NN~r?G-R;OۏJ)EC$)wp#6r֨G,\j3֟M5sIh;K=vc6ɋia E&hEpP͆35l_l3~\ן/f*vb]$ z ƋdAjv$'oW,^t0_vZvKk$ԑ/8Q3n踰H෋(v.a/˴PsYiaY^_`Wz-K5m0?=y {BJ9T?5J!8懋]sjeoT&zd|Ǐ"Ei4U*TN8S%&r< s*Xe;ŽA>wۢpsݦ~oVffrMPz"P%Z3+" {d3h2G5\<S~MrIx4FQ0 œ \er, (W%չDNS) WpnT2qZpw+^ꥯ*2F \!sLvRW&3L:*^pRJO2 \q,;2y.p*S;+ʯ#x|s~o2gWW^MNTod>Os%ԇKⓗ2*PK+QUPY.q*AL)$. 7{gOù*Q9(8pPJy Tzc=P['@4g$KwoexĿ,Ax tfXJx x8UɛI)rWyl%ʻĨ B1B7ީ\-M5O7_@;Yf&);8x$N>Ghٱwn6!!%7mtRZ}M{xҵtmصqJ'ym4vҒ. I 2{&XJVC9L"_£ˋe%d FSbFK_Y+ld\ *|isϾ/tfqʃ&>~X Y_45qW 7b`Q[h2^ pM} %p) bTmAL\xY`jDܟ1[GٕuE3aj?LTA =yzsi믧f ˊ9I9Xe]?n/* kbt~w5afNF ,_5#_-AN"!]$,YxY 퐫# VW"Oxb`mWj'MDR6n380Dm,U=㴠)xA Xwiua2$Z]u0Cu3Zk&F.t`ZŽVTi:xt7]+bvIj%vLP;j3{oh o{QDϚYg8kn4W K;JK:чuRZ_"fDWC#f8Nx{Z8am,%gv>> }a9C8=XB$DN5@A{3b\mK U/M2|يKzxDD$xMnԑz/գtXϳ2|G%mؙ1)d%38ABHi\K4. K4. K 7mznofqŹyeůηTUKRKgRnjŊ_əZV:hٻSl.ZYe2o; ~{{DxuxG <:*3zZ+(̱xƴ(Ebh18o3 RHF%S69kEZ*do%)YoQ~:}g1`5q E#h.c$r:G!!ڔ[@hJqnD 0b"EBbX4 2O}rNѾ ,:;zNj''_w97u,Z~}sp'MCf%7`4Wj9kwz<=.szE[*OAV^kndHZZ'gImPPF|T98^QjBH-,*PZCAV R6!H[)G8XHO$h`,m^@7lyݗk?ݓ?Ppm8o-͵Q' Q<1V`H\B_Eo:'V\8 ,&Kpb= Rh"|L+3bFfvQXr.OƂ;NiAbUIHeӂ:hQ@B$^ȴh!|Bxw5'g$Udc3"#pjT,eAry QF*(jA`6R͟vm 61J[Ĭ`#+IJ4;XX8xrzځW Jiͅ"X+"*OT{=z+M( BQ<*Ą1; A$dzkF= e/Eq@Iiьv.vS잞5-iɚnYTW$*thwH_f$8v9G.ĦA)Dk6& )$|LO3\vw[c a7#T06hqB-qiG@B<|T$ȧe[zm'hzU9^X/w..*58?'R(?B8K. BDNJx0$hB6׍x̵Cv2a[տ2rx-m{LPH jBP8H%Z.\f< S{'4ih|04dd.(Lkm:%M$A#5N<NoJC:_dZ]E9.Lq/^K~O V=y&tYpyq"XBI8+C)LN+*,L y)vxC`~ }}n*D*\KBǣ?| o\sJb8iRSSLPCg""r,1*\&.F \h8H3xĖQȤՔ! 
ז`Iq]8̾ʔ DZ]O͋y}&2^id8>2-Y~|U[\8VDȌ $ss&\BEGNEHu|Pp>@m2ڵ Yј<ІaPCZ4ެєj萦CIi KZPp(WH^p3L!PD 8hCe"SRJB)$e4" R,Ti'Fu%p(z#͌NEXTwSD"-#=$qgsB̤91%~r|7EſG@7fPBh}c oi?Z`Nću78y-BC('k#M3JqhŻHt:!ѻH}-Z-kd-biY&Ra9:?]p(~^}ŕ8{1Jͧ-''fW\J\BpΗiaRdS]]Xvm=G}COV& bOE]6fn7gV`mӸ/@7nm60ټC:)0m9ylT`` 3ZDA-vb -sfM/~_JlD=h('&𨹮~1 *0KH썺iAYCrMub6E!Gj9(t8s 8I*LHB5\Ciͯ&c_WkA6/iY߃B=(//O[o]tsu[=}+}'*Eb vq `4)BJA-}"@ taNʽr|!rLDFaB@ TBRkkh:A$r޻#꜓3*!jc ^P8-Z"!X(OJgc9;>3_A0S}yMTՙo/=U7Zx窻:"r+i1|-\B۫ε~4c$ eK,]?.۫MtY;h}4 d%Wmxoռ*|勈K;ԼVrz~5oޭ˚){.*⅛*0#G{^luCj|(2O4ğ%%]e3K;+Z':62 @)RKCl(WzO$d6;8#HFSyՓ^E΄Z 7*e.'ۢM"S#&lƀxTjUzep QN8e%#1(:6;#g|\>C_^s֟VkՐm:3K+Ho_0\g-0ux7 ǫτW>Y@K_\><6IRPJ*yQ[ޕCAx/9来pDZ+Yy4R +u˃"X!4>|dW-}?@ЄL$C Z4TA9Ѣ\kjT8Zj@ |4 ˿om ۆ{뙌.caj3tdQiv.u9z є $xo=-;3fQ1v8ՃhCXTPLh.L: H$(,!*&GIDW@l𒳀6 ZSq-H (>{{Ԃٰz; `LIPP$jԃN CZ3Y I3SpDpqYgk+v.j:JO !FFHĤVG 2N!hbڛA RH\*-IOoz(y(鎙oi  @TVq!p"`:$$qiXǹ1\*ZW{=`ll~ y(w7AӬX }ɱHYY:Q* /4&t%$W&kOgy0s|D389x OEjߧG\->y b~~ӏPHdl=Տ:Ouc褾rr>R჌Dx>Jỻ8tQtBI{Zg׃on ,nկmi/św`4LT󶸪^wfhZ lx08+ ^eoUWcquIdi3,P_ׄ(!-zjTHguVk !@0up-)x_&Y)8 R` sd]GfuGnK]Ffv(/7麰|#qc("|0Hÿ[1TȜ7\EF.3mylrW0T"P"@3J+ (6\9Mmhybfl:_4%q$%>uQ_ٍf%uO655ZrJ{ y,@͈sH54Z0 ?Vk#;SF.2Ss~v!KX(!јOpNQ}WfU"t0,,ope60iqdv0_LAW  r~4=\G ;$!H}KP,0^dp5km?ٸ][or+l[$<<$Ez HbH޵fIhG3ۀ-._:m)J;{}aJ?~9? 2 /fѕF΁!1%0+I%MX2udnZ 䮁gi*8oQgzyNwf%Gl_/Qc^|ui\|0i~~2. 
l{{,s40gw2B k4g6 9{ۺ|;0QmWB&+ͧiKƳm{fӋIϡ!y;Kࣧ{ܬGnT|]^|h]Nw,r3-|c+ ny^ݎt&&v\cknVs:3[Ut&i-<[[u/T=ɈwnCÁ@qՠawi`M Vn QiʅcI;e+f&ZJX56ʢ3Ϟٮ0dDM7Ձ}ث=#l:x_ud ~Ou/0OȮUEpK!DDEC*>"zJ"p*=Wm ѥ  X 튫V,lJ$P.Ĺ/κ ,8w5awf z7=c ,]%z u 8*&V+%P j)7ebQ 2&hM'̬1mxI4 []'NZkE׫+E_>nL35|o2ޔRBhc aBc)@zlхWϳk a0},Y;wPvn잟i3]W)]溎6Vg%{6WY#Qڪd+1Q\v w5皒%xFq=={r$Y+j UToM͞a_M3 p{u{gsVTyf=C<;tz=%u"KH<@dtT}A;bp@1HAuhzlPQnCb6 R2`b $S>`KU|gM;]^ ^q(^[^{DiL.|@)%r?<}v-ɳCq1[n|||-1Q3CV#hhN)Pju=W  1u`$}M1!D `%jɪgkډUeջ%8-^ xryyUd_XLJ*s)楺1Sfi[ʳew" r>V׻TR7a@qC'ÜprwISGʷäK$rt=R+ԚǑEh(Y,lb LA"F窔nw|ȕdYɳt`ھTr  7:x¬ 4XX[1e]3ycsN(KeǩxGbruP ģ n7ͻVQy:=&]{J'] s SF@GV VV_rJ!Q+?ʘɨ%Qh9DMY)¤IYe ^RJmd"MW@\rlsp,ZI|,T0ƙ\11Zr{wg j񡇔AԱėZ֖v5AO!\w|㣭^z|E>Kәm7Z>8Չ"C"ҶJ bMm$bj5!X]B-MfѹT98-j&mޫΎ-F+mE`pHd/5CѶTrE9VQ΀DbEئ8e*`|q.(̦GMT κ+0^<9li @%'P\UEC:c@mE@UAZE*CkQʦy4&_'Qo:J pVrYU br"0ٴig8YiZݠ_yҘ'_az 8%&v 845k09v&13Gqn:_ fM.4 ^P@I0ihsJ0Wu5.p̈́dsf[ٮ'E~|An*XsoY 5`G\lp>&QP8TU\ bNXIT#DKH}2)c5k<:|g?"VgiavȰd#(3 ъO cɾMr_+eO=w8q;z_/'= b=7~3/i8*T6g9|~Z N 6z:<{@vI4)~-*Wb3$'ڷI8/:Y%ΪUly7fyӢC6HeCC /R3'š J;7yu[~Zrʾ܄ 7,zw|?{~1-`{qt|/8 ^ObFr$ oD>Kd]arIh=qˆCu܋?D HګcAqtJSҩTE r* A mX7jKq֕e4+617\+ƨ !$DOTl.#etAe콘K-;Vzz^˓5DHFP!XS5r!QfQCKN)%5yetT$DˆϮmlXM`3gq2:okX}@N,DXffJƃ{qP'nk\zXQO5g$`fe8@Ig1\wm%ȞLqT\i+C*&zcc9q6hEI,H({O+V ,3*FCkHyUzUc"L~˻(2=ϛo@{fkOU ]?/nh-x}O|ivt'kӶot( ʼޞ8GmNy:vz%^m9 b򧷪DG]9֛JEHrq)>c8=rq;qsG9[?|>gKb?/[o޽;:p|s `5Qwi/&8gʚȭ_QKXT3Iդr+T^ E["="?&)$v?Ȗ p85tjmJ~֧YirM6g>yhaB[i07ӱv9+'78Ն|?irvޤZ%t omFrdy&W1 G,X|6ɆIh[wՖIaK(4_#aP~?NW|UnԞgj4>t++U8'3l0> w8ۯ/__~*/~+ͫ{/?6'I %O`}A?6M ;4 *z6໴kv݇G[n ,f ?a?<7hn˥rD:((aظo5UUTjv_5easf9665y?v.SElHtyc6PQW@d_PbF܁ဿjs?#O&=2vjA[;7u4z7_9ۤα<+HEi',5"iaLH  \9ghIݩOGAVvЙv- u>1+7a! 
K%S5h>pΒжr )js+q*MAuzlg1c~OCT1A,F03DfjD:OԧlJC9SQAFXf bygDA>fDFP7b-2bJfnٔ&| {eՙY#!gpr9haAqw]75=^*\Pab4u )mvK]xG&M.KpEԀ^ R[3~bOV"?WJ4MKi2-zxK2c(lͦy\lݽMe㥯ռKy|(?|z.s~4̻ .=_R,y2]ᙅvԜw^tqeNi͵7"yT;4V[+yWGH: 9()A6'T3-iIH6I9.:DE'H:ʹ {قW9x7[%LE$IPttƳ|d2L*3υd)+9MQ\Wg;I.cH+wkx3X+MQpH/0ռQC 0u1Z`:iBpBq^"b]5RԺ5Մ_=cD(M~y6SY&KKj41ŏ<꧵vNZc΄*+Oɇ/ (e$_ QrxQr%+czptnpӘA)}0< 8Y BU͜ S*#! b2, `JϨR\I]B<쀹+jlGFWZR ?hcTh%[?}Py/7L!psʄ$JQW7,Ǎ^ ⭲xg+!44ppȖ;28YCk,|3MIQόnpH6Mm#wJVt$R[-F,HjFڡԌckVQm)$u3iRrP7,uȒƘDکKɸa]4K˩Td40Ϭ1&djm^K54YBX7|IrjdxA.2:! fS? ׸Hѥѐ/P]2g$HnI9s$dF*=gG'% U^>5xJ,0 r-))ECBY~r^D@\̐`䲈 *%1"1 %D2&ΎQy Ͻ|Ln:ݸanKH)?r9%c!H8kYp*1/GoSZ-8zGC BV]2;eLYKI3fe1RˠEWM+Qh[Ů=nXSLMyE,,T㛮Mw u6t"lzf#`{6yI8I8I8IV[QXʑ4]Tf7:Zu(fRO쫜B]:3@k6vqflw `?8ػ%3? LЇi}p;n"}|4s$Fz@Z:U+!$1KoI沸OsWĺmut ]R5)Z)CUYb53"KKk9k{8㾂Tm.{w4Y9 j<'kϻ"n6l6;=;ԳF JWbšKIWtqSx.LeF9]\8w8pE)L%сN\9WoYorVӨlxʻ4Ȩ2 ŵ(!5glfP#}eU&<)GP=v6aSgm[;W]/RȍGm;WxKo?!J~y n_Jip[*]|t3GK,91{A+j^)y>O7,˜)9UA=blϋn7^llY椎\ޠ,!`%(8>Up$PJ BeDP*[4$H$]tfQI"$f=rl+d˛ ^&J@P"'BQx$;n#3%U3%#fUy.dtǒ'@r-ӢvZv䣱g> VcV>7'<%L5o0L]̳` ǫg«rIpMFd欈B=/5v6`!;l[62Y?>*!MUT5z _54Jճ%sa|zeۤ!6lz"~ɯŃ bҩ91koR}5 `iQŹ'Iڞ_E]7Ndm76L3Bh YLu8&zrg7]38G^AZ3V&yD(TIs`.5}9 0f)D}p6ec. fx`8s9^=}寯ow>uo}jipR- jk~ߊζ F׿xsU[]CUT]/ȂF]Nx~eC&|8Z칹n-D7qC[/YlǤ]+9K=W?I&VR丑z|Ȋ0xCݩIphYA˛uqWi7V.[:hGn|i65@0ƌ ji}~6 ?12K~/>G܃ǎIMfrz8 3) 0 Lވ)$T)C#&#'CswtR{xMh~Lk+H]QI26`Mq+\;JY#%^#*u &^Wn'yMlҠ)~ۀy{9o%nLqMf 4x:,s,G9,q3|%gS] 3+0Ʒph?S">`k 8>;ĬW0KZ{1HXe[ ÷\ /+оFWIuO&!\R}v Xu0J}Db^Hwa}vIw~,Tt<-T9O/yr&+*Ӎs]!eY̐{7o6a՞m>ó/sf'26h<=zM_@:9J$ [~sKM݊}=[S2|aƽѴzonc/6<.m}W{8ԖӃ̂,pf/q%T8 y Қ)RRJnbi1g;Ja8n q}!ʟ'`f8խЂ77PO< jf\f0=B@R3w.K$"X.D"B+[.3s4;"xt5r_nQ֎ͦUxz~В. 
wk%+LMS@K{э{G%4pWۂc B 8v @Y `p09sH[ ̉ʠZepʻK9_YjKisJ͙"z?("U*b v"epgFLy o ˃w4 z.?NY*r&bz>G=Jđ\a;{Z* T'XHӟ5: NN9t%+C#6΂;zGPs5[pUKDBQB^۳T)JSM4U^xQ%/y$")a" P&r8eV}vD]("PV'%(%vd[;c)r[&yDQrux֎,ڹO5kXv ֤ Jt!|չa {-R:KE5Nsgjkl߁֬}(C^=-7yu-:pBWkPqq)cLjٔySƖ֧V )6w,4цc&C"e!r,Wx?Ɯ.y+OcC(s|x4*8Ugh'*07R95_8ص3>MhIp%`qkv}7>jS{6TDf" =T~@Rr;̾Cv}0n{>y7j`ݨ\(OP~`(.=YW@`U"WCW"2{qWP\$`FD.;*Q{_[wRN\=BqEH+|@ J҃WZUrB'bL=i|'ŮXJҽWJ;qŕT^1Yb&eE^b*ZqB*ep )ͣ5g.g>yWhd ߞ=a!:x:aCQXaEBѐh- F$Iޞm fWfڥwHeom/av6{9`;l/ d5#K${_bIVKm$3"Sb!{EnJ,L-*S!l{ɍYŎ_F 2NYGJ[d$3VكVúI\P/ ' A}0"Up#IJ*t$Q Voۂw=6(V%+*6##p"`:$$5\Ә:('K0V\*to z7+_ɘ6&ռ-s IY6@"ǟ,`(*J=R&`Ǘ @2T(S |ׂT)C!(Nz]u8 >n| &oj=@8( pNӄ$hKS$9-o[,\=]]6n;WB8VDȌ $K# 9 ! vCH:~P.Wu@3UUdZK@hô‰d@h4FQ"C2Kjv X32n]̧` 뚰a#SV"l4f"SRJJy )k$ O!tZ[S+Au%I}#͌NEXTwrD ,|LʖOj⟿NIٟxYYmݴN ;H74u$!t4 kFβofY~^Q4p*u4<,z8f?19)'Qla&&娓|5.d$,}>Cn{.KzA{=_~Tv'5?=AO?ۏ?~ˏ̇}6'\iwb"xO?=chSjho147+ f\]r >I~roz?n֟?~XK1qD3QPV (7SI]l-ij[vgopq&=0!Ƶ?;muz)71wH( &W8y\wO>{3 ?M<4b'p ڲ?7:nr)mD=e4EyՎ[bx\sc FAT B`o,y 3^;2Lub8ix8CEJ &$ &j1!TCW._I!GQo>jc9j Noũd|صp>( 0B!V>YiP0ǻ]Z.-QU,q Kq\-dP KA,)gp@ؔ B&CBd$hQb1%3xCm }罥B PK$)PYU]##3_CtP;a?ɵ`4Y?o2|E4oL^'Fg~9ft؅[w=}6D$m2Oխ@hw׹Ofrd2k.k+l]?fww9tf䝫Yw-+lY͹}IϏw>qѢ畖!gvmotqmx~4L휂f_<1c*#\G5_4m-ۼfbk%}ļ3P]:D/Rjv8~;W%{ t BJ+ r\%=Ւ@gHm vvF:?##. "(h'+B o>xnUH]@g6i⊈FM@#4R D <pJFbP<6ҋHs3ЗaWqjx7X VPpHa``:y1^=^*fWj&{LobjVsԺ VLURzS7*UG2}Y'+AelJRKF(j#o$>7Md`h9来pDZ+B$*:ˆAk tW(Mk]n}&S,o1޹gG{+H**H3%20"*Σp  >B;,"gm\'~ 'ڄ;щ|Dm]J%Z/,&t _(d $sM%6;j<36fqz!˿:TUVJ5ٓPס7&>bv\( l@@N^p|}[jyg bl7 .>ϩm.L[G(6>ۿ6=NeS߿\^T/_Nn:9KZ$ c"F&] Zs&Q8S1OR%'VBn Ȧw.=-Ouj4S#a>*'wr#`/N2Hf'`N07,NBĂ + ߰LԼ15tݫĶ}D=kK- ~?#DmK,F+wZ܆ tlqLale ^zĽ'z.<܎s۾gyFEjEeg&y9*e2cy=C .;ef(N:u(bMcB΁ՠIwp<-8)-5N(h95"*$ k,_1rT4h L9㍃#_ܾ1a몳Na)]iB5֔_wWC\>L̼J(!j:!%"'Z aNFC-!mYsmV ]~Ȳb"o!;R7R c:Z0 Axɨ:?PmBQkr ,7/t7B7ez7vy7?\zE~w>]ӫ ? 
G p*ӾɊ5M=>j!, gKZblLVmRR\hj+fJ_WAp5-FĄbe1DV>Wy%$.jN/R|b>^!(e̋gV>SH!&:Ĥ{E$XtRrN^He ✘ 録2{$+jWCZn7 R:;tt"tĵ"hh\nGwQV(Gs*E S2tGG)pM< I(eY=^h 1nr vE Q'GEC'AKQ,ikcW3>}ev +Bpl~~ iZ\-.9$c7|OeGcGCrfr8P\Lt_B 9rºGe녷b{ W|>Zm@75oɾQk3ޥYGtFp4Y# xcn90qee州z~ggٮ?r_.=z,EF0{B =LRDjf$U3I*EIk$e+gSq1u6L*:T aIgӰhV ̾7 ꋊnޢԭR#1y<ڤS* lܥp"H\d3;ɮ&]-ׯàOKBY"zBޓ ԀKҁ-AAiH>D"2(ɹPq9&DȊb9&$7^*cȹ[DӻIFFwH/~R.p5>9 ro䪽^NOē֖Y[坡ښko8S ^U+AaCIAk1KZ5 :K!ۤG &H#VՌښcYiD zZIg6g&E͵+TmX5v(g Ku1fiLu_YD\jshp80<ξp- r2VHN)r,@zFql1 $Z_7m2 C|L (jSjh#@(t Θ2Cc+kjܭxW˃AP6v`q<O29+4:P&@"(1cYȄ 95P&C&&Z$aQp$E|YEdT'}e}Xak/Zb<X?ՈFF5[: N`}F2MϤa6YCR x'aF29)jDg9cI.ke4 r3!t҄J.$KZE@íDSH/Ni^^V$F |$@ }"&l !e/ ŇЋkqǡCO> #]C]# |oshw뽷{ kl{rOIo^KG{ ) *7dF#ʐN2(3}.{.DhZ*OmpɃ"xǃFCTF9h/ )QLɨY p3c<'DD4H[@0sBdLF8d*jxKLf#D8j<˖n2T`ʁ@iYpI߀"Lʅ2< l (0GJGzi#B]NRSrADLj,·pu AL1d+.db%k R%V1v ^<#K0vPAqȾa+K7A/eaE?|ǁ4mfq GiDq>Y8$&lzK ˏP0dNlI Xo_ԽqWx]m4uڧ W# G2mY ~rxH)&cH.K:ZAV*%iqeL9B( YzTI$3hѱX o5rvڰ/RÓ;0-7^iGT^20D2$a"@,$sLic1Av2G_uzrZbR ,uDat(Mdo9(Q䉦*P)AܧcPĸ(u=Zp a݅1rFK Dd%dD嬵SZ"Z!0kkH7QW#s-M; R - >{ML%abI9YI\/HuijsxcS(s6gc)2{MI Jpv |.φۚ nYweCjZRin%G蝡1th:#t~[o5Kh3P.|*9>Ʀ :+3i!#aEK՗4A٢>G|0":ǕtJ5?F];ߜ~ͷ7/O/ߞ| {꯯N|?I&{|MK ;4 *z>໴kjvyCb׳Zg\[[?^7gS,!qL9^5L=+(QWl.;t*RřQ|vnqAlpǦ}4Sbt4j#//m<1y0t15}1~`?3Adȸ'Y+'q~⧧ߏ%myyeCL oKǬLFGZ)ӑf(O"d7>b98kBK7W6YFZC%Nɠ<z˕C)!+h+AM'i\U⯼Qԓez+%Ky2';CAF'@z_v_<ܩJ~'LCqF~H8&3JfVB#Lw܏Zڅ3+Qӌ䢥L3V!!iVE9Kg:<{9N=܇NE,d Of!Vһu:/zohsn9 rX0rmrlu\]Q=*>Szbo]dkr2v950xw.@ew`>ȤT e~|4 n\5$b2~V =0> ݆>G[PkoBf1-^nHׯ첹Vvtw{ כ9u`\^qܓux?ʲ}3󆔍dTԙpY-Rp,,lƒ..i֔I,,rR eCpbt&s#9x'SAMjU3Op(2E!}$m%=|*sҲ\O\bmh% ntRJos4ec(lͶE\9^~X[a2oovauwrywT=ݣyΈv[U7_;Uu3'=n,GJ-\m:q;7N񴂑0֑5Py22/uv!pFtg2A` 7pV&\@pƹIiT>#4K͠n셋Z+tmr: ޅXrǀj^;%XZKks>d~镍i?W^o؊o}nj# 0ug`M ǫW%Ť3nDC|>3羊ĉf6 `E8dYg4h*9^`v 6[P2JUB;Q>_T%Cn65hvYJ{H8q;gU!heC>Αv Щ@#m4%zg<鬳}(8]p]Q1a˖T$E}@oAr$B3?N֭q<~m&m{taʻ_]-2f˪TPKHc@TߠgU3),C-qT9* Y@g%& Z앟0%(^nZ3.WKں/XsSO+Tpܵkη]5sfVŬy:\Ы+VU#+ZvZ7L+֫ɦtPo2[YiB%0.t7hy?IWE>xEJG2lR82MsoWLẂEzхC|8}TlXuZub*sݢlb8&lsVhLlKyufuwR]n64^_7+" $ Q.6m98%&|rr3'}N ݦcU5"}:ڟ(՞ 8L_hv-!e\Ycnh\Զq]T} DO'k\"j@*1a`; 
A$\)‡ VMxbqHY1N0S )ّ>g{Q+R1lc=-F\iwt" `CZj4pr_M$:GqfI S[4"T:GvR8H%U՞s\t ޓ׹nV-S|NQ38/.=-x|wtY({ZNy],0hi>h@mGpyǴ NY/ Uy(o罽6hX;;xW 68s^\Ŀ͉@<p!f\q{0̗uI᜻!5dm ܼº S_W*]Hg~Uv/ gn7klu"OK q҆6W{ґsԸ13E6"o%7δUC:h|v]q \Usr:'`=3w^b:v7pc3`5q oE#h.c$r:Sm]kX 4x%|Ƹ)E@aHG%*E2>°$h-D<b\`T:RD)E٬zfW^l$=?s}{޳+ʛ$U4Ȑ ZiILYH:J'A%BQS⠋"*#ZX4 'YKdjg :%*I% A42g32RNVƮXH{R*Sp{1 ~ۑA: j:aCGb,z9)(bs4Fs \*/ID 6(`d1ta.&fva&[juiEKkZ&".rt #d.1sUuzʖ{'!N !gHkQ!քHdi/qD I1xXLx }AbcWDdUD6X&M*EP5ڳ㉈ 5pVSHG$AKg\p^CQWry(x<]J-Ҍo">j8i*0.zRXt&Gp{+HIq$o3Ɠey/2(Zq? nxx@9! mm85$E͂"+C(R2-s:d<dH..'aO >eaĥ\??#R+3.Ynܢ a{qS u;(a0yY𧓞805u{~,~l~4\"7a""Ab2Z3&t$6hU€N[At>EUW8scc I)L*$ <#:j>1cJ*Sibl`QGL|Ӂ;w8܉Ж(RC` NR7Z8A:|!"&RH xb'iy9u]uLenlsJJMd3BDN t"CK*I@+f]m>s&w]O~wDVo /A7m[X3D\~t>ăNR1!1ih 8B#8 $FAT?cp7B({*YoFt,Q=$Ab:-4wJ&TfW$H1 R7l!B7sݧ=Cܚ>&,{k3%xR&pl:||T)t1τ$4Xߠ(x_S^6=7 h J f@8hX«Y~$Add$/;J2GJqh$>aYdR׹K'1q]6 &Sn|,$G>S$b ْjoOQWۧN׷VWӜe34<ˇ~Lldwk \1E)gWY`J \U WU9Kr6p\ G+PZw fq59Bi },%c>"\A髽~>uHv V QdqVB)wd6_%nהyuV1qk&$W2 "{̥ FˇffMLηgJ?^M62htDc O4:x m3KhI&cܸӴ5QZJTT4(o]%1R7ZS(McN]Ef4>_0 N }}c# q D;$Y(ךYM^n5%}TVOR-| Hm^7VQ/[}!io0ﭾ”,d A|.d=C( 9 OUoNya0@Z?hr[5{b* bp|N bLK>^w7ɧVX{Oκ_t n݊%݀ t2/mPhDyi{sakb03XVM/LC 40b hf Jq&P82Gb% zVBo \t4B?l^˭˪#_&+jݦ57u~iQeW\U1bN1"몡+ĊUJ1TuZ7L+֫ɦtPo2[Y@ B%0.t7%fqE>xEez۝"i8pf]|Xt?:s23zqh5N `-Xx`+&yyoh|1S9f4Y&ݚǁ拦o^;e]c(TkW﫛 MW 6|u̼Q=q88)7t?HkGKDȩ(^ 0@X>&rilω.1'+aoYQ\iYV1Uc[}X8pAXkхH!N'*Pp^g _hvsq=.Qtɕa1VEm'AEiͽߧNI|^=Z(8%:7JQQ%& lg$$s^9@`o<8[$,\K;ۣ܊ GNlteZ4uc3^0vWqj9ƱzG'4gEHVß5gN*NHY > !0ϐlWᏯDO$;h"173HJߢr5;u,Wđ@J,yX[9h$Ƹ,pS#&ծyt_Wn ;v6b_~s3 d5G2ʹ=)|6:MqLB Rv;Xq4;xg 6ys2Kۺs{"f\q{0̗8rI᜻!5dm ܼvХ SKZ*wh~U*/ gF%:klu"O"SqnPJ)Fʼn\fT]߷I :i&ruPHY"!(OpޟTtْo-dhS^w'o>FI  V]<<8F"73%$ &ɝa!ʦ4Q#( Ò-Gp$qPDpJK4gaɆni:c;ۼRRl<|7~o_,L|g59nLXx^-M ehQT|ƨ4h_472K4ɒ88tI-pm6|fgo6{/dekH^K_WɊݶNHd/ /ohd2}2/޸V%$c #}0Fdp)AGrNpK ɨ.Y>N=6[~ ŬNIiUݦ-$ ٩ށ% %,$=vglP خ/^[^{@4X-PHJnل_X) )!\&X1ySkF)c_@h8/bd@&ر?LaK!_1#vc8xĝqÖ2BJH(:'ئ;/UFPRӭh؀.AN=bR뀹q1e U?rB˨Bf 8 B3q>):b:;ӒE//~q'&Zm @ mbT@6XXɑZ(RE%r/'v}0.dyj?=Yye ̍^B[QA{ я/hϐm|J0RLIٙl 
DjɢHǜuz^?ylӔu)*fh9PF\ I&_D^x=|4؄T\[-HC,Y1f` 0$IrYu}3q7PiW{_ ^(ݓ}^8x2|6)㵫0 6$B09ΟqhPb $BCTn+~@Comz'mk8dg/ h^ج3@ 26M$ !i;My8d!nO>}N;OWn}}Y{PY7׮\]=ӻ ?@z=EmY`In[ZVi{r[2!)B蜙?g @_? *tȚrM.7FjR ?C?>pAPr2U<1ƺnF!s!؉ Ony>mP<"Uh/Qe)w]L_QւW j$ʷ_;N͏OxYeܽ }̏M>`d· MM8$OƢFaA~s=7`he2`E:2P3HB9$*Z%#b)Tւ0BZK$]Р,:*Ku> 5 1L=_Ec#}xz6m%az5't&/N\>NDD:蔪khLRN9h鹔HF ƘԂ1H5`K҆e,/1PbȣTt,A#lgow7`gق0k־ ݜby`; 1K,o OHfޥ! om,JoD\wʧ#];aL(.t=P+]IeS[]5L^; 8FR2XHofE"`2 !x֧DnI_NNp:rwS|S75Wn}z,t ҠHFWk*8lIBZgSB؉/;Ӊg&}CBj{˻6x6="]5< s6sGiQ8X$M27hHS IN|RaT`T2ؚ_"hi4 kU@")Rl+ n%; fF B B*tńJG$v.ߙ8{ԫ`'`PF39 88NG:Ӊ׿]#}b0j/R( cP t^H\%B11 ݊v~Vo ޔLEj|c[b- v4cbEDc0(.aQk' ySsݬy:cczU3 yu:bl4-ͺGiD >Y$ &/l{D64NYo>v 7IDkl_C+ 74{SN ~b4]M7 &y|8qS"$S!aΩ$H $s 1%x&+\͔J|`>imExBOI9uɰhFܡW <;>Zy} &S"H^<Q-0JX4%%Na+P{9$]/kiu=GfTфr|&e1e(e*Vj9z!cPeZ8kl6 zXQo(g)x*¢P<"k >XBեW/8hsMM{RQ+* Ce%"/a ߩ4I0RᵵOeeYi|^Y_i )Z+4GxK3Z vW55H輮1R up*?# s\>IhV?%jaQR,"d8_و/\xԦeZ*4QMr1:eOZwϧWjX5՜Q]j&3 rvtm~],^xw8Y}*ZC5s2d|||:\>++7jӟm-'qy2{n*ˇ2N`}ujM'=sy18^׃떱7knڱa k2Ѩ.Q9-2F vu#:?}8_ѿ?ÑT_}O'(Xl?Z.c[n7skk@Z|4u<}ޗlc.4u\QPu<_a?/kI#\5qTR8#SG25='_>ۖ=0ܱm]55H麉Rd9fD#UWy\wm'AR_@g^tAybY4%_oEI2ɺTF`KEic) l`* WGA m´HճN{SAu& 5_e_H,څaM^+wo$^![Q>Ơ7)~c!e⭼,pTZW:ĒzeRɡz7X)j=|$wPɜeh@l 8 AC頡tPo+W D":v6EB&[ jVk6 dGLLی֐'/C)9)cL)H$r,#t &ٚTcJۙ8{.,}[Y9]]˳[z&]lwm_ά"SJ_%69!  FIzgOW~O@&*˕dry7X͕+-Q6^sTvS7z^0-\bRx5Xܚ=_'S,h9;l^)(e|eZXM-~Eyخ-5h+íxy9ZHU9;g.=5^٭^wmH͗[^737`l`E2Ds~ֻmI[db=U,VOcת|{L "W% x+c&&vHPl*WŦ4ϓ -]_L}T֧*dw~wR<#Ut|9`N;R,UNdԗS *,"dZe[ps.1Z$*,V(ZUr:d*[wi-Ƹ,4޿8=SW06[4;Vjb >e4nmnDZ.9xdR(JG (1}&eS:`NUT )G|,+L6e:C}H`Ank1 F&hم9pnBlqG_Nvg^q@XsB}~'{gy]"7`ٮ;#yGeZ ? cth4$Zwg8h#c%hkj}wHPR-& WFM)O21Lh1tc#O4. .Pooi͠'e<.RQzӰ7H:!nImz<iZ>|?}6\v,oKqiX;,jx;(jI߹!i8b,ϫӖ,vt6c[p|fg]{k=ԳC^8̢JvA}ϗ O-8\)% :]mb$OԮQu8X]:֋U2s%&Q"`( ÃUV^-21eh~8h6vzq`Rvc:<^*j|t cN'fԓI3bg<4#Z!=HR.7f1ִf9M*˚shZN{kP WXyQr_Ґtrg-H~ZM %? 
lZA`(cDZ3L鷐uֲd,,-02, 1,q\8-<%V&n#Ȭf3 8xH|>N29lEw1 mf$X익i}A#ȼ ^k9zxbp)g#(%͞ ѲtRAV%W9Ns>I69zQ- 熨V'.9KVqDR/=Y]`l)fY180Ǎ,"4jy&P&6+*bW;7btݑo7Y0 Jf#DwG+QOl oGSJ˂KaRg2< l (0GJG:ik#B]"w>Ӳ-B]A:;ȴ &A2BY21gb5*9fx#K0p{ U:laE'դ(VJU\ģ@,`G 6kj=dqæI-`Xm+ XKl_] Wd]u-4u}p'5D $ɴAf%Cox&B!e!8(hkk+۲H eL9B(ā YzTI$3C4Xy[ya_ݥ';mX)/;נb9E!!  \b!`Hӌ {n9$&Í u=[+uDat(dc9xe֐1J<єTE S2q_*5CBifJt< :a}ߪ#gJ "(%k<)NiH>h)DO? bz?jKVhaP` kb-Io$fY&KNrNs +Hw}OԞOXlKϋSdϳ} '3[~{h8;QIY`?8sHo{7␚p--"ød䒝34G\≟LE\L+]Z͒Z.>,Aclj $7^N2|wV5 uQ.gǑ)K>}{,ܿX6o??^\TkH)9NpEۆfJ&3Xm}E9i|6el@647{˫Whi 1%j|I̅Aln[XmմE;Hzr'P{hF4vifYޤi/4h,:=YٿOw NAbM6ՖduRr ?F$FW\ԧFCF׆yxYcqĎ7g߿盗gg/sƅ={Wgo^$5iR집{0 Dztm[t--^¨O!7(ZR$OW^w Źd}SNVZ~AAe^\|&1%Oz8ob+UR܈_|OϻIXl&p]|\hF/W&f VaX= ?k: A~&+>222bgJƉ6:~rhtwr:38fntdJk1%`:2 Yꌺ,^Ziwɢ52wɠ<z˕C)!+hLɰB$Z9:xx+WQԓ1eM [=NrN_j@Og*4 c%4cpULUE]]-GWh+W=36 Tԙǝ%,op$ؔ-A&12vS@AXL,,rR eCpZf1:̑ީ-1bki毡fp: _Bzya?PI0,Vm{hpY~5U!V'Ff)90W4{7=m6}IdS zK_KȠORV+zyyF~)yl .}ιufwMg-^>0FIwC)njhs ;zKj;w;K\ ݉W{n2xnZ= Vc3 r{hjC``&#[ǫE}nHS?;̓TIt|9-YBEfYQP«xՁ-`v_#mvgz oEwaoǟm@C^Kk9G!Z\bdsfA\F0[]*71 lh_z5h#8K- 9aB]]sr+*?%Uw"|4r!ylm>n!̕]o*= ~Ȓ, JCi^l=ifCMtn꫅I۟k+`TFn9룉!LHAUBM%gO$oqb5 ͗q ڪ b@gjV)yf  8 /UCeU6~BKr)@\UŨB"jLNk xU4%@55tn<"Chm @t{rN?/=|%xȈ-jqSɺU?3TYSsH:%St*q}D%KURM;}s9P&z&y}BVlMH%Re.+UP,3'aufB.z̳g2&8N_9oh B=qYU^O 9P)WT$ߤ0|VESo֏PfD~9YR-sqhdhQEzGviى/6׃no*0-5* cpb;b҅ir1{ھrc"pN?p+aH>]oӦ6k˃Y3~8 vN=;ˇN7p5?j8٨rly@AydLy]_%-Vni` 1Y=zO*z|Y;Jj`F{'qAT`UC1VS. u&<"c (5GgNw :_lg+qv:.;ŎpȀ p0vtK9}Rr@+:xB8!YʐMQ-TRO){Ss8kژ!VGFU5&㐕 W/g~nfx,_ł{úT)2cDr,ʲe$ye+ hw |G[o 8wX(~;7/߀F_@Y0㌯?J0oBP`MĀC dc-ܼ=毎W޹}Mjϓ^n7?cQl~dϷۉiBEܐ=*gTbHM袥}pU<8A~A}hrzZnt-pD-p-6֔1({m2|N eId)REc nM}ڷCrXٟz:Kmd JUn>q "dHF l%8ui4E!9B(0iPCfWSn?&̐dN9И\Iĉ{2gD25Dh)[bmޫ/c'#O>仞Y+zJeݕˊuꃗ[Ⱥ2S}q-߾,J]Qy#qȺ(_ XAdT 5gL>jTf)Y GZJMIyebp*έ,N.V5ڨXz#c7s#c? 
ܒUAmh]nQkj+*N'535 )fS?` [aXoUϟ )*^`Q}\nԜ.:yNmIrZ+jGF6ʹF5DFgbG!"`I)}Hއy$M$ fo'X-U:Ql]#QߗG L%\| Lw갢l~;aP})zݺ;W-rwmI*%HFdqX جaM"ۻwIIg+.'(r4鮮zj~38k^YީM6LB.( Rhk}o6s=бtluc o ]xu`vN|et.j뀂XvX$4~kSSw:vzÇmMGݛ|>zw q25kvr).=bc';5,lSMk ji$)hf~NCj=Is;/kۼɆz/~=W:Yڡug h\rn6ㆬh-?1B(X$;+o2?OA@xus$3S1/v-1{n~xKr%KAJ]-8YDnWFXxpăTȹ|f6>S7wwGg%Qxjc,:[\'<&{U\v+ˢZl۝ԣ x,FM׃hh-ɲcyHC; - 1f ,jY6!=al4Ҝ<؟7Zm`SfJkxȍbh^u֏phOKZN9 ilmbVeQu\%U^Baz5 גp=|.X̪='FoƬ:1ZsĠJ8_S i=„;<ÙWG}y]!lG; ]4JHx&1VL" ʿ!a5T0XUZ\5jltb~\=Ҽ^@laJ9W 'r~:\=d %&-(cŨ|2Euڤl(Qx[U[VbUFBVk8 RI qDČ6Z/b&Qc `ґW"9o6dyE0EqE&H+/njuZ)4Vو T/Pt@[ Xk֩搦 mgdWN 噴UA @W&ė9ĐGSU9*ޤ8f, Y'SxFcHJODXA;;ㄸ[NUgd x*WXhJL &L( 8l)A@:9g(!./5ځ0g ˕3Ve @2)CebAE5\ɂb,*&aE^Y`(#܁?Ubl('R% fcI;A.IGY ̢jTRpTf)4ͫFiPrVv-=+먱^!MZH̤A@ f|t,2 c%`&k1Ҙ{yڭ0:f}yL?\]͖U&A7!vfWH7]pD3 L[Mn;`߁f18j]0!v58EMyHY5Xh4v Ƹg? 2 |殝AExS!9ܞc.(7|{`6Fm夋Y.eBAt %0(H&P#F' \ Xoڬ׊aݢIQ!C|?>_a+@1\ƒr:HbF 94R'yMieè ]E 5RŢ4V5Xz.>v}BU d<@i'C5Z|$A +I Qx D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$ЗKyS)@ ٓ!`= +=' $$D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D Œ@Jh+ ɐ@\ kIf F$D)=@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D}$D!stH +3Z|$j}KD}$ D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$@D D$DZoEw󢵚~spܩZwiy88m;yOVa]3.68b)9,-ʛm`l/]*-o~]կ{B{R5!գ r#3ry< fdV}dq>qGYxڅIK^mF }Y-/Ru.t^q=.}-BzH-boկSY旭˛@irK+Dс#U49e{, RfyYgl;d}e=2BFyC׾5:{}rhf3ɂ!*Y[m PMf15IΏPDzv}c.~dV>9'P}}=${/ߣWn#/i;>?_vqy9;]B-HV=hS?n_ރaGW8g{'٬_>;>N6NAPRUJ˩善HEvTKr0V*3r+tw'69E;o+ktȡzmA$9s]i2'b\[BH(H2'ɍn݈;!7=Yg NqonMۜ8IwgV@>uo^7aK/WU<,b7onu<;wImu/n5/O|k:z]wa;|q^y[Rwِtһ9kkxw.V7'/vfۣw6~IzK|pn4@)bi]\]wX]XdA7{5:53UXZH3L(o\P7kaJ+[u<[C/Wg8z1߅:Pʼno_¬~?yPaXƵ}/10ڿAJP=:f7=Ƌ<-^snT:B2);z+Ga*'qhuzrT|vSkr0F7\Fùo f)e,km#9oyypllN0jҖZ?Uû!GTKy%}ٔwc 8$jH^5oªggd\}tz>9 oY(߯- }`g("6 $W?!eFfMA[.kqu,&KDZ&U|QxCOl?9>u~!.7GzJY/Dqo͆h,x 5]qL\IUZ|)ϦMfZu=땮ddM\]5lßoG{:^rᲝ.ܱƢ,ذL? 
hy;!6d$`$m<,8}KBp-4ۀ <`I)Bx Bbù}Jˁ N!O)DrN5GGޛD՞ZAD[((JfeC;8hd|F1y,YF(9A_ Ti-h8` H*YcGcW[=^DZ9YvB9rBWYy.w|g%(*k6JqMm'v^~8g:REh(<=p%gbSG貉;l<⼖7zQK'U3W_yYE2AtyaoBέ%[DmYvDgC:z}ޣgmw q}`8JSkouBZpqCtQS@H(MņN=310SvNe^Nwc%.ۛÒj=X-"H&^<#pNa5tC:kl!Ԃz dR]/ٗ5 VtW6g A8O}qh~4 ݣG!f;${Uj>܎u99 JXnVދz^ T/NJ@1Қw *yDC7M(bΙJP%F[C]6v(ۼv>hw%V cs4hriv;L` PPނL`mX޵{mlo~?ۺǾ"@ơ{;6pxF p6P`X vP.64Mp5j&Wlw hK^3He/[KSP/֓Usc>oA$f>n&[AnxVh+w=es/p^ vxs0")S&:| =aI^_>Z=jMl8΁=糋/AqJ _pFFһPwtTDG +B0W@FF9uh])?d 3I$zo=53ǦQ1v<AcyLY E9Z'8)Ũd3.g`nA &, fIZ[U,K# iOgņӑ YC%֚ 6yI$D9&b>GDh`?$%NԒDg,psDO#~Pw8_:r$o]bEБhԚj'Lo(>C9=*דz=>LWe֎HE6 F}9H̖e3爋Y3rtQ8b4_-!~)$DZ7ānLju!YƩ6&BRH]*s[9* 6zr,WDe>].]PVd8hBHj<'Il&Iǜe"z%uk'=ʱUa|X0wh` moTMj^[Gi:,zg]&M Ov|R u}X L%䫼Lз/kpfrX^ Dp` {SGLPy]URDѸ3*Ma*O?gh ,xWAwRc?'RwTUx\5#ȅ$05hZk|Ӌˍ4+גu7i%SJ_=3=]_a w9kf6vf%?4ooҖr0QGvk i=OHdYkOBn颭ڍdub9M1ЀUD:?/zst1cdwI2W.fc`U'i04͞:ςFצrQ4묱Q=+O߽?y72_޾7 +0MF w]c~Pݯ0۝:&xy$4\Or4SOF ^9(2S%"($D(r>ImpXGdsܮ:3|v!'jyP: `3TRI`B:ĵuk1F\_tLz-fV;/ 9ϡtmX"+Jj.>C֡OpY0s&Na2@2 ה3qؔ @&cd$F b6>罥BP$F39RY>- 3;|bsciz=KvM`s$֘M5ͭ_H7 Z\7oj5\m0}9|Zuū nܞ=L-StS7*x*'6bS}aGf>Ps1<ATӫ\ڨ% W%`j)` s2Y&0k pzeLpp]7uNg{9hrѺC%&}+ß%!`/1"|4D@'-Qy "dRsVh6)m)Kn2LJo9}`hp;L` PSr2l~9wzS=[\apQ;AU֍Xm2o`Q †Ko g^%AF-btRr^Ro5)nGxtBh3#Yb2MMY z:uߓcׂVTs=ZvO!1mNDS=)̕R h ⹱l҃>"d=Gb=0‚?Gšh^ax9]m 4·קtD}k3-GԿ{)DOVRENzǜ&h-*ei!% SS}iNcK ` dtuVI5q$I XCYtNE!%ZԚlN$MG^' 242b0OS&99SL<՞JPu> )c:vo>қ 0Zsv@.#By;&Uߴ̧0(>s3<DjAiȖGn2-ԗd"^$kbc%N)Nsg|] o#7+\6{d7b 1 G7H'b[ݲ)[A[lU>}Z{/ )$rƽZ&c)`/ɸh"#@2K{"'u/.3qž *V1 kZfOk$ ⴱ)rVGdޅklvWQ8d0PEo/`oBK%x6 .~nݣNjdxa,*颉v/(?sfYYdr+y^GkO|mP=!'(/mBVV>׾vZ (ZVJG )WkIׂ8}*Q efWZf<:DyD84]\RVD 93<(<4c{={5($RA / jE XvE&Zhʼn z[*uڜ-+p/j_d{ͼ/]q8wmWizBNX&Xc2QEm]IRBq ~=c=ܞ; gR ۳,'JܯO؋bB  GD'2 xB%^2)D),VxWXͯ|כc=ʜg.8ӕ w˻zrYK+i7ɰpQ)sQVIMzY`15 RB3Q !m7ĥ3!4eOtrby4-\nF116"0Oo F%B#RXa,Ȉ3c9;_wtMHIJ-{"NE#V)2e72pEX#K @cgF ^7֓Y )A kb|ާO>EkqfnL?12seᙩOx:i vK3F#k.jN{$.+{M%'=ʇM8ōB\1SuQFIDl`I9NBBEA pqv3琎hc&.3G4k ]Ͻx; /Sۭn5GG/J i~Q[ѿֆ(GsjdILQ2G/ RxPx%$`gu8w <ϑ@cd hE'N 
u1FK|>43g%0P)G+T.K-6Px36x@+1}!4]FjgAwX7/o8ͪyϟ}E׿JDfrr9Ϊںd] =2R?ؑzWоOh5n緿pۯ=z!%48ҺbwFp2f3[$=+b+)FL&b\:ņR~v]r 2PhiA<7s&IgUTt :LN+Km }罥BD Dy K,wtteۑ@ˡW_>3i`~Ga <#[aPJDBh 9՗D[FT$ Q&, K Qx4D9霠&J*vC&4aif1llSKs~r$|Ef2s%jm>9od[UyR#8ϯ^yh R_ * xC"^%ΜH&*qA'D5sͅ))=1 `0`Hh11(T IŊ3*|a/G"Ӡ,X̯22^,nߖ7Ap/np88_f"PIRID$+%^ZS :c xVA"JqF~VC&{.0½geɕ%A31Kh*AL)챋gQXG`Kcڴ=il)E4wrDO4)R3B=AHJ벲I*I2dD_G&DEf@!pD'IHv?,V!]Wv}="+Y{1nX%$TxEmqML49e' e/V#Ze@n&)ZLVJ\L ( `J<+q#6G8\g^_"b7"%\y EH?c<9g"I M{xx48T8(pa[2.6cW 3V-(~TDRsǮm:xT2^OJ=,"tjxTPIU=qKbzLU{oBkJսC݌ Y^e&%*jrlS2lf%R( u1er~ N)2xHy&2bV:Q1 gB~'y{}wiwf2mOa^l{Aj>f ̍lYp3a'qQ M7\5(8l<iFz!iA]<øaZs)PB|:'}`ߵ }(-sGa᳏Q/?K.l& ůg9sFv_x`2Z" py{G5tT6f +jLCUK7hg~Z\M]&"UBW5'Տn/%-Ov`!?﷿)\G*R-Kodg^GV6ViVk]&ԜVYh?--)%id9`cOhdF}#*u-p;yb8+hl=]5 eʹ@S xmѵMXkPM}7eTG}S7d%8\60 )Ad*R0($ŝQ R6|NPDsͼti˵R1JNOP10Z8Xqv(̂KPgE$K1B41{JǃHvs %M+w"A;AfwU;1T*hz7{_+yp{U,_ͪǹ'|SrmE zyoUV'ǣ&_}]7߼7߶Uo&͛yw.,XM[rU}KŽn>]b5o|}!%&ۭ߲nqHO6jCA d1 R].qV fgvb H8ǒI~).eYn۲q,7[_g||ټSΚ/*[8I'zbQM Q5[W?\y%~*JJԔ()+o[Lڳc )k'їPd}BҠHP %~!JŖu +O/ Mb wQQ1ZBYzp:z 7C:f+l~U-3w䜾Lݛ7<1z4O|.zUv֋gH/Rl,zI#'Lҩ:e'He "/A2\& HVTus; nmyT:Y7מ Yrڮ޴a3A=ْ}!|z2ِt1“rQ7l yDcSZf ބ6Y?8Ȑ5=/{V)Wr{T/qTH[w8t:/]Ѵg/Vq[6"}]1XcYzaOfYؑސ=eI o|)1$Zϐ}xshwWXyn}7MAFL*dC0 }UE)z=P[N>(.:FW$|UbI&kUО*;Ic2P7EN'F}b@8F#Bh*5WZ`%jXrIaP(:mmM5Q@g1SJK$Y.GRT7 myDoD>۳"vAx(hօ A 6 E(Ka͉D4Gc MϾtB׳to z1+˷".ҴEe Hbu1;#!xwuPH;/>X xL 8`D 8I542!%܈%!ٗo"E3~ƒpZP01ʘ :!*4 JEm]srp `*A *(&r:f@*AKH !EW:Ҿr۾f;[Y ubűNJeܧl4V `Nv<.8<ɋhj C|v6fh"6llu I °eFa$JkLPROVW %:[͆cSQJAWSFYR"`B g4YX '4e1;ߩyTqV1$3/]u,mK|TX Z%C}KB x`bo$M,3wH:FKGH-љ8zoy0lZQ>gD,/r%c̺.wt[\w7}aqӻ%Z%sRN0ϭetZ^_WR5&mKxO;&Wo `=+%i/x};6eѢxwoœ_Oxs-mL^wcc<)'rnx,xM~oNd=yW]--8n6]ͨfPf._|d* :ukNN&zvxep݃Sb{Av1VWCrݺYMIW2RF^>?NϩV}1ˣiU<"gMs+C]5Eeg'?yÛC߿~y'{oo g`ntv= zPЁ[Mh4jZ;ء+Q/'|vmCh|˭9 oG?_$}<[czqk:(IW,bdz-+),ʝiQޮ$pG1j#L`F^e'C hݛ"A{UdlQ L`)*D줧v848YGv-^;ډ-'ѓP `!\"R2ks^u*5JB_9ya:7NsV ;[BwHmzqʏ1RLH_LQF$&%鈵 z`X1V%eEF8LwJRiL c1FFo2F , 9RD!F/Ed(1xFiQ`?HupF> #]JATm!Do[h P=ۤEte1I$OeoJc >&)p&@0ҳ[w6җz~:r8x;XMV<_.^STLS{uSK3U 
֊ꥤqx?)Sk[@-f~~vZz>YcSTV fEjEh90{0yT͏5 azfFŤQވ{9hE;JgZc gАW9)I@L ۜElAoOFg*V6emHm7 oד558=}p+[*%2V] `.lh6_ FxbJuVD͡4rbj^]]U]]Lw}WgC@@$d7 !jXQ[xr4r4T[ Xjr ?L>%u 1`OPu9'` ojn.{q&Ol-&W ~JsCUl͋MF\&:l8'wOwU{u)#6;ƛwW@MRPߖp-E![\/ ^-cQW=bHgyq*3cFͱ4VԿz}Yue[xX3sd-%=D)# `S$-Ơoid|Ҳ 5tHA@WXB$DN5@A& $ Q.6MjHehU!xBN/pgvau C\qr./%ׄcE!(P ulSU׌o@D U~K>8Qg+-B O:TD@O5T8!}=\a8wz'b>3dVEdH?.;{6Q?wK&ijqW|Zrb'/΋y5f&8iJF &%0DŽq͍N6p/3$/[Awqjޕȶ L.4_bM7r\#j81nɵ06B$(Q623%m>:3RIIf9'S28 e]b!ZI.7ʤZنvt\2FiYm\D"FBо) {k#ɖ62%}tF/OoW"x}[[Os%kh)Q)@&/%O]-ᴦ(ku ^b;RPXCpt:Q@ he/ .iٰ4FΡRp1NX$,3F]"$L3ZH5#Vp\@т7b 9|xFo^hڝ R"IG j( 1RRdȨRK2rJ(;/eh%0cg ̿kۣ 2ز=m=Ɯ`+c;߬^(vURB\MʕhX bDᢶ wQq;*_J:j+Nڝ;gG~[ɵP.p K˕TEp6; A$dzT Ql%QOg EѢ4sli'>/O2W[gGv>8|I,:gˍcg=uDf:z0)o^->k8kD(Z^8l^S+:38J+HH vkWAxdV=փ9ZN1'13؞'t5dԠNMo(j0e&1W΋t)9oڦOcTx2./Ф dILYH:J'A%j*ݨfZsD FbЦZCAV R6jZ36Fn؜Ҙ.l3Յ%#:]%y ]70W7iyW~@aїA256@ORE6B0! }JV ̑ $ZuN lTc!;{60ceɥA퀾)4L>&nXc7FnFðbƂm;ڢi; vL&y7)Edrt #wDy͕b:nJSV~HB*CFːb3PׄHdi/pD IѩچacևS? ʊ-18V#5"4b Ę cM*EPWDGYăD\gQ42kT#aopjT,e=:Wh ]=ih( 94Cc7|qɬ~y:c"4y;Jť3>3KVANڈLS<Oݿ\Rhϡ[c'6mчp>< _BE:~27s7EࢻK{uяͣAyŇQob~OP(އX B-=/0n:A9 ,޹ϗjf(pɃ0U$Ng>Ny3!hμ{i0HFOr('{ Ϛy՛|_ulMLʅr)Dz)BP]*#ޖk}Z/;,8ՙCx8t%6^ax7Yj> 9_޼*~t7Q}{MQB$tЄE N8S$&ŜYIWtS4lBh͸>'-,hϾԀBHࡘoe8ik.+z,0J@fVDI{2V>#~zOӥZįalaQx JV't֚f;,'ȝc,kEY/*޿9s.zKKo/$'Rdr[gR 3@Xl N#c߭RM'b볃 R DZ\e4OMvꣽ`?[ryf`w3bem#ӅJaxyO4:-yxgV(Gy(qr?ۥj/\{m_TPR>^/", 0?kU ?Fv&5'QB'q:$\B5,K2 g%^ev %*^?G3m1чpb !otԿ[.Z̗1si~J| rQ탩bf e*و\]eo9jx)0nދEnعhx$72bf@_ Ioo [g{93\kKsZ &<")vi>|oi2w2Wܽ7=Yq6S"kfU=5z#[hP T$ *a dTqͯUJʡk3ͷRWW2kYG3N[Ι)Ryt$'.R%-aLSoi]{L뒘 G5P;W .1\ۜU\%K=!& DBAg,  #lij_ ˵c!֗cI.~.Pֹ|,q|0[H%՛ 6JSZK-BA6ImȲkš2 s >8ƥJ9(,$SZR"*0"Rd $Gv!KHR0Cf"7dž;R Dmw $h ^jŰ4(dhn#@3D!*'#7AeԹS$('JpR*y%Rxψ)GN7HBTvzv!T7v)NoLJ]&Zi)cOT6ܡ4Ί\J3'JyP6JiכIY2a+֢ƛ ~\JΘ @<-neom/eY<櫌1 E%勭KpD9#Ⱦ7Ͱmg6MeSm 8 nQA'MNzDղK=B&㪞I|tx Wά(}ruV28Utu|DBKJWAyA3 vXmAԨƂBG1 P$bDM W$DC=? U V:?lq@) /. 
D,/xT;7v |t`uvT$ 8UQ+ɉ᫈;͊yn[&sJ $'a O6Vx`v!Oh/wu2u1qׅT,S0 w@܆3pi/F9 (T&o%!(;݁R"ԇ e@7 Z%tMކFL3uz5|< hwXUAzT̨ ƈŮD#;8-]TZP PHYcvXT5I8KNF|d)e$bzc"Õ, Ƃy7* s.pYTc(! c=H>Kt Vո3uR1M`QIRwT:[fy/z+Ac-B@̤I@ |eM:kP0sɺ6Z43[FŤ-Vε2b2m5]wjgYbV[tq;-d0L&Sns`*i ,E2QSn IԵ<] 灣dus7zS= 푾0wA]wCP^o:@s1sLtwVdskh+G JTF˄Jԃ (0 DUZ# R X 7nիdXllFV^MwuErq\ \1DN!Fu/V0 .TuBJ8RŢ4xT%#>z i,:2WhâHYa}}JVl,2!r#SH@jpQP/.ΖfR,JL:-pՔq'k=k$Gwu΋:`j o:k/];J* Yk0m@ Q,gfFZV2Zcԋ.!WQׅX~){~ Kx>*5A=CPb.⌑TV̯:^IJc~XfT 9$bR竉0K1:Fff%d$\FCdN]Xr좀) Z`4rMTF Sq1yCU7B7{q|9+s-$kABhCCj _0ʵR'*6.RU:~؞ͮbO;ݷ[| q}@y@T<%Ѳ?v%2@VR}J RdH DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@z@Vs@0س$(`-^ +&%רX@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJW %))`wOG QZW)B%PH@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *5wOH T 6(>%PǮfGBJH $$@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJG tuk*/մ>un_o;}Mϖ.AՖ;0:_N>zHz:W5SgsYYϲ ͬK]ƥ|OpGjOhtTKHƼ~Ѓ)>󃝜O1"iO0"hs('YrO< g]}t\қv1??Kuݡ@vyv:\?x]:ᎇ1"aūzKƷYY_0l]^'FߨniPB|;Px[ZXl4ʖ{1F:d}ǀ{dں9 FF!#{*Nٶ jRcƒ$Vs%o,j&[/#P=0O4 }~nēFIRЌ$k޽[ Ocd\A ?}G0^\ ؕgr>c-P̧'e=xWէΐ/~ky1|^NQ:`u5eQF.5wSo10cՑPLcjwOl ٯkvɠeG<_Iʃ]u!<feQںSN3eQE{XQ-#Jо_^~dYjκ{+b-ßŏoN˭8j{7y_)?rNP/N7 S;q] bmj{7U ^>kz)9陿pB֣֫]\/{Wjŭ]՘K)ztN:#bi|EPm֭mNQ+ߍt?MM_ [ גo?//pe@|>]gE}6e{6|:5H I;w近~d]/iuθu7?m}t lΏ;۔UFGU^q$vZV U,2&J0cw˴?gta~as_qYJuGpg1CV|S?Z"m|Ꮿ]\ܴltGO?^zI<޶ӓtqxz6y6cⲝhk>j$6/W׬km g5f]H»>іI/9fsD 1\PR> Sg- 0UɩWmNWR2.A;&˜K@:; w;9tq[J%gzP̉9V_J= !Int=(傖w$չ&㛰+ٷ6;-gWKʻY $)x,=Wx:ppuI8_^7}8AeStH_vxWܥT s}T?AtR,<د. 񸬿j65.wYn~YͼÈӛfrз1_V˻9 T*nq q ؝O>~>uX(K! 
y\ ]oڿlz@Pp ›m|2+d'{y*n꩞Hnv2.$c׺C E9%xAv濢S1%D-k_muu1i`$">/5cLotJzܒm$Ja^fg1e=2 0 ,h`g#m)b}#XERTIzLy'|hd-#;{zAn+9EEAW\ŔĨ@pW΢0OHCH@Q2$kX HѰmrU/2R3hDZ 9>i4P|ґEn ZXA/Ol#݊`7=d{gʝ޽.{[|;w{:3k_<`ttQ'_|V⬒=xH+Q%ӚW:XItu9aX >˽.xHC{dDr.pCrV+ Jx&'p&DcCvH .Vl*>E($ =֝pj] Q$ga!mx9y1ݎ2v\]~RNpOz8K-:aX͛}#y0sM{}Xhhu_?ٖzYL=)nF⪪7rxUsatyJZ0Q FGeP:Y!tȝ "D锨D UB'khJ)jg^s|k-bq6λ@(nNcUL\(opO3C-V\>G׫ooj|Su8Xs79R;WhX_ߌ -l(oh% 3pfPԽ'|a0]>/9Jy|I/5KeM[ 4PPΗ^zԄYN-ѐcx/]R)4Ph9pHo@焇  I{&Yd*ȉ1'IBI:T"Á_/~ }L'~p/i>tMGk?M #\e cO1D;Ĭ5nf8kЦe2)+HJML(-W8GA6a|ɏg vt8G3 NAK&#b3 *P (d6<~6<>R3{n9q?yz~т vs0cs9ycCUIUf=@:뀪fZ[3'«CFm{\tyn9_*MJT%n7?:rNul<{ A'Cdm4%%MNl"PA$X2<(b]ޱ֩?-x7 vƳ{9K*,TbrУ`Q59Ի@ЄL$M5Z4T$9CsEq1QqA-4_A{I^]S:DJۚX "{,[!-uϝ|wA/\mSB݃HFZ #d=uuC)j;̟?5d9fUB8z2HS @A)NF]zH$9`n>61H dA(&(S0Vp*!R8*"J1&*&KS2u*^0^i!hAٜ mL b-g1P )Bo޺gzL酆Q6K`8"t_n(owb%"M"!)QPeIǜ8%fѤ(5{Kހ$7mS`x$%VhIKĈ0@"p}D럒{CO?5ɢ=3Ft NrFn|;9u j`' v#2(UDdK$AE4nCR!8dk& Trm-!c1;(^Qy{[߿qw#xZ#&DFHĤ0NAibtt*`TfNDoW.EOEz[" ,|m>(V%#*2ÅRzHB'ĥa,@c23x?G1c_|=m*gj7Vora.RΊ<5ȱ;m@D$J&<_r꒰FRhKt HȠFe:,P8'͉@.sń2.:ᘠ<'TID};Q{>"yީtȈq /;^J 7!YNQ,Q]ȴ $Kq& Ep,k\ڻS, 1&q$O^f <$TX4hȉc"Y@݄KGѹc2~Ul"5ue JDDyM`q3))eޣa+r4#4ea?[b4 TKJc)TT:"| ɳ $%<'\kEt@(XnV41 x,ƒ$vc8o#~;kb]{IHo[M$LHAG ůsT$Ώ|19*Bw&tPXܦWge"R+X"/%ǟKMtVߺu6 [jݿm{E{RԼLu=\[kn?[y5?~ b;*67AYi/zA|io{ BzQ1L?!#ʛil/[nmާm BS1/% xkM}J3gw"^ˋԃoH+ҀךY vHּɂcfOTp;}=RknMjb^ZOtĶƇLS dWq݇=kM'hqkCL$]wEt 9(ȬӮka?~ 6=|YE 7*X.7C=/b8ɵ]y7UMZ}_8fn|/΍MWmɃ|=An 86_PKyQ`i6ڤlg+ 6HR19ՒnL[)C#.qM,>ј锰lNqPZY=,z w5;{|d ")#.8b+FЄh 6_!]ب!cI| l+g#᠙z^|,/gigÜvXѬDzˬqڃCI\*ZHT8$|%4sR DꔃQ9 $O1h($aURj4HN2)oj "A@L$1B2əq g j=Y1֝݊BXvLWz=A$Oy6V,;DG:UϠy~<[H O"g 2Y^hCK1D$$8JFy,adEdnY^{ 8^>j fYMv̤dXFAd Hka]2/8d0SwE}[v#m /ij m҄j\~!*4۪ixv*_mٵ?g'nj)$Ϥ=㋋ك]iaC/wIp`7c>)~oP9o1ayR-Ѩ*FLNK?0*'8khz5Ijv +yR!65M޳1x{4yx>冾&͛M1cS&WԪUU^CŔ6NboDmH 8ڧ(d׀XuM~n,{Uͩ^93)ץo5ǚZ }E*^xw X5?7bF Ք^-Ng/B*"W:2ͱ}L}i;8IFp@2i,/ٛ[`=b?Y$m_dc9q<+͒cɲLm3b*vMwMϽ/TJڵ_Ɔ^x@\%'x؎rnmyy|%}m'5~XfpF4Ps![ #o^?]T6goMTCnj  3{Zr=?||Y=7i&_kntV"-D5q;cYa2"=W[ 緸ةTY5fW?ͬ+g>9:ܼ\Ͳ<8`f>9d"0N&/(N[%rA3"t @*nqˋIVa;0o (lV!זMA8͕`*&MvAǮ*( 
,dzH -Q`I +~L>kD#19N$qtf@vT@V|rL2$1 VrT ͫmU ~S\DяY%)?dGBiZie JcK d]=RSJMwOY:}[;>컍Kld6ڞ<1SY )>?\wfV2&pck`u>V8IS}.vlqxLWM;e=ncj IɁ)`q2>dE) B)4.Ǥ$eCg:(I>%:N`୷AtL.9:k5gKW5YJZUZrBh* e眝u'VIɪ=w<6\ hRD2u#3Ϩ}d9b"KR &)jha9<ƴ6s"J-MeThbȶ&Zs;A&o_ ,],SBziGP0kk&phKCSG=8zjp0,6MTH[£ DeaKad.8~M#3BpeKyQ=gfP,g(_!>yjFgNHܮ*䵾~ݿ^Wo멟^FK<]0?Pog&1JQ#=|/ uSV~;nBߏ:u^pJZ64?-Ji[h-AL䉱'ҲkTlCZ6khٌ%^fЋ,`#f-c^ th 8Se-FHV7Kt!3@|8a\ 3S)bfdfZAʧwZs${ !7 %lg aBT0NЇĢ(JV6-F JmU,z7 '7) 7m򉖁T1?v>տ.oq n5ҹ85fL uNX\=E7´}}:7@PhUTne^1|:)gotqi<[aXc#P:q: yiٹߠstA#Sl\j<ܭ ':ܢ ~}(-%0E:A# >KNAә_-},->y&K"=q64N:А9wM]5,t謹N3 PzD1XV2?ډ3?\L26ٯ2?z G uYuYhA|:_^ުm]/kUC8[Ͽ_7VPF26P9 RzV 'T@O !iͰKV3h`*2#U'adnjSBA*bp⎙B܄s8ͣ#*11_2r=r"KSblC,HlɚMA"N"' ԝ[y^i 9ǒJEIș֙G+/C.x#)a@?:4/:{s߂~i6  Ÿ]qc+Cu\aG/%:݂6-g+H@>D%K%[%MUT! G F+G-1}X6["[X)Ĭ= dBQݺ,2k)Cb F&rmH nZEĎ/>o42..mAN?>~d-}; o_{DGb؟ h ЬTkU2T=Q4r-bDڠk3RIGC/Jzb:t >ii 2E,nDpQDs:gDF+Z 9:::oG O8~MZ~0]q 鷾L<(QBkYη+"=u۲dzOw~B8N^H?>O&/5G2`hC֪6h[%x2R덪q))Y#m;T=RDؖYFp3XeVx 3"@{m4de0+c*~U~d %wH4=zG5@$`` dkbK${2WdVKն53 dbMEvUWbanKY]~ڵ|s/uwsby|&OiY\޸>OEx)5Sqou5*d.%PrǗۥ?9#7* SQM 7YV+LeJ@D%}b:Zqwdwg3_mZ1r~ڶY83=]VQedARZ^'$\H4JjIgB,(!T}{ӎ+2_w:9εs$h<]DY(lNZabF79%x# x:sW4'+_hf(eL1L',o5mHDkB em2H.g]kVJF%yvDPIE|ɦFCTF9h/Ud tT.Ȏ"S%fltgw_#ZѭplNY7wEjjT5:ZLr?[nB1+)TRk+tCR& ZB' ,//|o_Y$ &lb8%vWENn~쫠n5Xmk WB[%/!SWf@ZIiJL#Cd R RV \ٞ]$_vU6#t_Ҧs#m *f/sZd"ZX@e 9H&41͈JA]Gڻ]o۷,F'aXaM6l`GgMQxDSR 0w: +mizƅ"RCº[c䌶4JtOYkD$B`**@?Pj{MiI{bFXekN2,DPjO,cH6b_0[k_:̫LoX/PI6Ն ޴4IƊ3.h*Z#.qcEųOY$/Hr%/a"A3+/ni5ʬ}v][S3%6M_>%oM!7Ս9-ZWm__Ox6SWĜ?F|6zwڮtk Vx9oڿ:➥WFjm λ(a:Ygi^44JVbZ'㳏˅]9:ͷLNA>r]vՖ\uVrꤥw)/g䪸fO ^Mڤ;7NצrSMptqJ_?o_¾}?޾kB+0OXFt`& W?ch4rhiatA^,>>1]Q-7W{k#@R~qf4k+.crqk:*(q+D6yc\t*Ό{m܈7U."ylzpG1j#L`F ^e'&uS <<%t=x-Vii)ӑf(O"di԰Gxdsܡ&rH|HI&[J YiE/lD (OҸ\s }'>ׇ|a:&sV ;Ý1bs9ޱ7տ4f{E?. 
)0U3>K?oL묣D2fZ4W圡j>-b._YO_{ygvk~jԄr|ըԪA/OyYZBOvI;x KzU=Jǣ/ɋ{W^L'WN[(Z oګK<6_Z~y[|J>=OyA+> 6*Đk#!d1z*}Gݘ60jW:x <7oC|W+6AC9pt| 29M5Φ */O/Ak8q9뵭E8R哨AX;FkW \ʷHf3 c-4c5puLEC)ّ Rh~Z]կUߊϛEgK ص1o12ԩBр:3dD[-Rp,鐭1B*i7ǷIɧU@V<\>L2GsNz.ixzFΞ'-m^d4w_qÁtnɻ'c/~%FrMϔI#|$u1 YW>Ai:s*IR4V^^ނ$ K GrU;[-49~瓫tDytկWi&Z,KkIW,E7OfcSj 7к9m_\|ydcʞȪ@z#&%FŷGMd2!c!Q*-`5h欞\뵊`v xv6:?O L.y1b5v~7>U&R{<*wz)zx̖Zts75tdF8";pGƕس5Y\m}JlB l%k5[T\uMuz{َNWmvgںM nݿm{V|z5Pv|=z8O-#jCd6x,ld$5#b0"IK$xQ<4S 4 KRʨMDC)E!1"%㓋$Tb?/&s"ʷΛC$B&zI])d)0ir-TmV+XML"a^j6C݆޵#;mŷ;w[` HھȒRA+夃$[$U>X}c)B*7 VoZ%Zj]q>l'%apBifǓoWAASڶ 6L#g VϛsH=v*b%6|I;8.~AMz*ZY/_VtcM\/< yܽby,inT8@;s;W4̘@ٞ0#{ܾr2=t.?μzvឋ4K3JʳE蚸XR"=[&NI%Z_ Qcufr3xneky@`Jf.NI^遘Oۃ˫7v˻ry{Vƅ﫞q2;Dlz>Dk_lZaj$5Z[ct0. Zi%؆ru,i:r&ʙ#gbKУi9# ɁE2"B(߈wT,c <N v߶`wM&p1=Oe`5) (EǍf`<0FHsE']kaLȱ CgCU3O][zD9mu:[U'Z*fƚmbNP $Fٵga4Ռu|N0}|nωeXՓ ޙڭO؉bb2Xd\$B>dhEkysNR׎IIGUbWNOt@Q>%Nnv(!T1r@\XilI1`@)Q;\,%9'J8ī2)=bsU[vscc1cICA"T!KDrRGKjhWS< ]`<*{uw޾B3MGVD._xq?3Lpn[,|'7Y3NӤԈA/ !B+ӭ14=>$Ho=_{A]oEo3ӷ_l{DHK))꯳Οt.p܆pU: [)-9gޖ͠-h5`=Aաelp26TYPj TCV!˼3,ђq+ԝ3)`,w) }K8d39; ivtleRG.A= |eABLɋ$e$ JAd9&H EtNrT 0cjgӫ.4`i*ɰ}MP⟫I1(v^\]_WY.׷>=>I|[n̎ݩK8w_.jzk&9d"bӑ M4$kteI&a,7A!mUϨ1UK!LdBV] L,IQs-K\:]E{ h:Ddpu Tu>Q'ܑGVDPfl_܇_<Xw?= Sn8,b4 z"~|G@}E?<{Db4l >^{%4mI4(rƖ" ]zྒྷbz476~7`iWnF\߻%X+mLEx!v#pSAtuJPzFͫ翼>iۑ+ox?8yM _}̟ a k@Z&ɾ"AJNv"fŗ t{I?W<^qr!d"VRV,Zˬi-F<{"Y+sVdZ^j!zAEdn th 8Se-FHVWKa/9·G91%kHZAʧNlam6B>;M"+8Y*-60 !p^pՂ>$GQ9keXFxkZ`@ՓjfMY8~|ִ/Shthٴjrf?5Kw|{|=yϗ$/_6/Ώ hpz0Mvb6*k/%}\u"ѵ] }s㛏BƗӛi?npHT3ÛA+FLO5s)Ѧq#B1ƦH(f>}:2giӰ1mX147p>ubbqK=rt3fi.فoT߸WqKέpA0%;:U#?0>O^kg?A9 BOٌKܽ'bܢ9~WJJPk./#a隰d9_[N}rܗHoAkXjA4kC]bΘK3n:75հGb'_1e`- +]2S|oxauXOAymMgwN8ݤǯU? oRȹu{U+}S߫l}|, RHͭWLOM bRR,2k)Cb  \LlSw;n_Yo3/'QS{=p+='ҾAݟ%  w̒jC}ZUAʌR4קАIxm2k ͠NP& }W0_wg SJaOZlB0zeLGD7O"(c"Ι+JǸV9kD(^- _@ f1wܐyQZQruPThfͶazOt۪p|ܽ2i;c9|h<]DYmNZabFtBol8;diϾYǻ/m)IJv`Y>eS,&4)N&P&KQKYUFzŒϐ}Gx}hwWXyn̐}䜗ʗl ,XahT,"I,URGّ]:U`h;QF~w63d,+<Ģ1pR!s`' v&/|q,^]TxP@"[RMFK.Қl dL = )G?^A'%ގwJ ('9 %AF*¤$q,sl 1G%>: ֟GhH㪦==m|g[buv2ALm+.C21gAfE[y! 
K٪q`c j9mg?rZ+RXnsGԤn&C-H'0ZӌeYF)@3@G+O.bH6b .x߶W'*C%8/!ܦq[JNx>o0oh?_$K./a"A3+ίiikJ)VfV7ߘ)rj%  ۔=i޺j~im?x6SĜt][Vk池6r$}=M# 5#Iyax0a7˻4bOA+XVb1GgS{ru1ɮQ;jKH]:-9u҂GO ,]_NS5UqIwoNצrSM{mgq m?_޿yo^o}ͻwo_,aI}0 <%@{ m[ -a5Yì ˸)w{}rcap[/i[wqCӏ$3uTPPXWlb|-G8b#U L$hz`W{0 "Ylz~M9SF^e'GMn'yY`GK^{I(23["#S#H ȵxiᰐxdsء&r >$j[ !Ҋ^0@6D %ap˹A'}Nw0^fН !Y:MyyJT?>ozXs PK&Y2 H҅{1u:H87AuQ6rYsΒBȑ'v*92v"+F$ V[2R ч2Lf"GsNx.~{ƈQ[soH)LPg4RENuY ao%h+C̄`eG?#nOnK&//q< q ?]6i3q N)*6F ^RAGG'ԕ꣢ egGYÅveՄyMj`vklڇг5:DytfdӫiEPОN ł}fӞmy ;&t/˶} '~V>bʞȪxƺwoT ][xyѐ1ZP( "1'Wz@s γG4tlpAٵagm%V^l'ν>n% 8]|i<a_7i}..t!7z.M /Y:+reKsKvY^c%] ٕ;INu>Ŭm<_喾U5n_Zm뻻Otlt;߭w{yPtxz/~5]< \+~uK2!L\>~mw;,boܝ|)L>LSiCSg3xڞVu{y7N`F#doZdj .^B`F.GrvI~pTURr$wL |v2hΐs!{iTb>:~w",4^]n.{"6Bd 2,pw!J/U V{SR"r덜Dzs|H~2|nA=b+^( 0~c7`ru@̀W ZǤ.^JcZ6Y8S%#e=8?=YbByf!(Yܫj쮘wTb7OgZ}NqQe8EG=-m7ϲ!a/!l,B2ol%f Ve&& ׏,th4n_v~>qsRTg15qh/i9G߽л^}n>c<<8^*o:FN U151`r<~ -7 k_TjڠF[贋* @*swHulnSF>q[[aJ,jy1Trk~л{T?xZbG?~Skп  8#6h1 m1X&%a]跄P9|Qv㫺dŧN|bIbDmX2; ^9+Ud2_ psϓ`DH=Wbv)zBZ"$EʨMDYCN)(@ O.2B,;E7tKp=hfI!1C Q%s`%xc Z N\f-:xvWp(`hQE}[v-݉ /! m^=xB5C2 ѻ$F鄏b ;l0w/ |B+9x߯jɒ%ʖ= &rwK]dUXEc muh;N9YP[PxL?0[c ҕ>WoIž"ty]Ja8f{%W׻x>69+{& ޳J{ۈ+ƈ\͟|F!{X}tu,enbR8L;91ML 3!$tb@OW.fsf<ҡz{$uvf5b6L^Ey(41z&/`2Qqݞf?`٩"jc4t37[9QXj噹d0{1IN LCtQ #n ~KməɗT鐬Axλ+9qL$9(V¤Kf! (śC 2hU}lR%+EsVm=Am=ΎoqgeC\qJI}^E +/Vln(5B3`L\imnr1p9pۗ9Q:j&3Σfb%h3g )1pXz\D˯e1wTBS(})2Wpa|ˋճ껽#RIߣW3 #|ˢY:&4JfK{ށ:U^UuP8O1j-B<*en@yg5{O=#k>3hKZW,K}=yy'["4ٹU^->j=[ Nyc֐hqKL@Ql*pV9H} dmp&֐H%>Nܻ6waȅQtALSzvӍȬdhƤvƺlų'723;|$u }';H@Z6{x NAd!E&0.U,b+x FTtCZ+DmsL l] }%:WkRB(Aā8T cjZ<xEKZ1WBRA[ݹGV?ۡ4 @(9$ N:*mCIJ"9d$HmUd-QOgtFѢ9(iG>oO2G[gGv>8~l:ˍco=uT܈5_b۵W/+G%=Z6!GK򒄦OɪGqQR3bD}0" $)]/K<Ҕb{IgTɨ!LMٗ`UvI>fsJßۦ?b:NDAT::&$,`WcF Hu='^ꋆQz{!M7iؐ㥾`d.y9k޿ߘ=q+-̿Cw1)&~ݐ>6-ZQ#2Sz,_Rƺ. 
tt~LiC9JtzQ'ۀ~h(R;Ѯ2龸Bc?A/|L1GY0K7i[nb7}]m8o.۲^Y#_~fMոrɻ1\|al-,Kmoim,-P[8]U[p4i[akC0Ve7Mâ1&!'JvE@[&?;B䒖p;-tP /ʾ,b%m+ASr)anS18-H\dՙ9Qtdvt5r T^LLOW/OR1B":|d#x`JqPeފx ZV0#i@!LĈxD ȝ^E53vP-rv+j!dq4s&'gR%]|w;n"w"m;)uMO/'3pcK\_ɤddI:UB2\{%ȈF(G!bȦJ F阭'L N\R9R IơXϤcDwr\, ӛ$t'@㮮^ '/DN~BE*%Wi)A:cKdIerD H!ɪqXɿ@P= ) lJ#D-,vX&C,hS(.gc֒sAmޣvonx ls)Ks"!WAB)Hc:,x`KAq!+.x8L!פ &Id8FcVrv2V#g7N, b58EeD="5nq^.g5#kR3L160J?Y iThM)|@rQs5\pGL̃d]MV-D?~^GočmF%9R7tZi(}\A`{=\_Pr((\ďxqG愠RU>n\O>DwYFTZޑƓ2&.QY);[y=|_]oG ZB-T-4̞[x6}k /`rP]`*d?a0͜l`Zs2G=pR;Ӛ %mdkRDdpEtyقn_zӅv](Ig>}w$?G7?;tzVjЕqHv 2jEEg_WWeĄD Pc{3@Tfct6îGV'ěn1BjntqsIfr{=^(-" ~W\͟QqTE\vǛVc~n~A%W^@ɒJo:ou)nf[-Im{x=`:B8ԺP0֬A(g4GEz0-<0eck牧e;yD Q\}xȢ 0'i69q42:ţ'%.qw1m.nj{]坠{ I>T'}+ԎpBq*~o;2ی\p(13F dn[wx;$pe%sD,7Ih4܅!+(]}!J%=9d"p4\ @q~:Zj XRKSk*ӧϣRNz,>!z!QC4DU",)4X8XtD+p1!KHHI*>o+@9xꨱ\;IxOcS0d5 ^0}% 6TRC&AH&I:p{U04! A1>$b4i.lt6kį# fR.QIHRH@ aǬWNYf@͕ iyhAIiu̚JZTi6`6D%D?nn$ \3.Rp%ZJ㑸ͲeDlw`:]N&Uudq$8 8qZi;k!H#&j-Jhj[oZbI8rYu] 2iE 4 Xfvte :' 1IXF M@ՙgAKՑJ.|L%IsKOwɢсL++7kBʚĂ02YQ6EZ#+7dsHt)m2m8`U% |/)6Ո &yY{EBb~UTɑx 5 %C  e06jueH WE7 U+c+:c"&23`鴭_-]bOl̷<kc@JbI wby}L Od.f;슦(sU QA&'x=F/v )X_N$Jhƚva\#,o+,4LEը RhWT*[fq/zTV֝Zz7h[ 3C]$/4$oYBeڋ!K-AhL-1{®pU;ǾCDC˂)H>vNIl;e:Mac)#̓s,g R/>L?+l$+IU3ZIU},FD <,KX6IB0oJF0[8Ĺ'=qsO{8Ĺ'=qsO{8Ĺ'=qsO{8Ĺ'=qsO{8Ĺ'=qsO{8Ĺ'=q\νOMꥧsfW'ù[Ľʹ8_零W91 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@_ zٍYj+ގLV6~{}\P ;g}4=/+$C6}l%M' 8 NSM?t6|hCYm ⣿|LѨPwk: E 7 <-6.6{+:~faf'a~.{WG:z) dE{/6:WޕM},RsT0+ w9IMOZ5mia z-<ܹS1P^o~WpnmBz8Om>zDdt%G|p{BAϐd5)_D6>cI#Gsa do>-W` !GFM#3p@?)S?0oǗ/ 8?ty7yIH3/fG/ [zߜ0͖n9]zpxFrz8iy05kv^iͺdX9hwc M\eN ?䬯`I_^7$zz8*iZx([(jwtӠ{wm42sFӺC7ǟv[l܅̷.i = #phsdDoPuAc9~iF( (YmgBJe6f"2>AkS>-w;qT*eg^[XtHYO\K"e[~BOya>Fo&{Ef h6@7߶vPmʫνzNn8~ue( V)@ǧȶ9:)%j8׮CI(eϸ:@BHP){>cT~Bvx6D2Z}n1jmefڮ`mInH+zeKNt Om/iNxa;%@uäνV_~\[`zu9eV9g|A9|mx­OgOĢY Ft:qp1#Lv~t?wO~PkPER%:# h HqwkNK>+hu4OF˨z =偮ڀw1>EΗ_ BG6[^e7>x#qS'd%[w学swj%M!8g-<`;&}phދӒQw7vN]&P)g[t}2-o6MICBe]uIWVXGxKޏ?4{M[ϙ뮮k?7}ߵMzv;j;o%/whӦ˲QIl5y>/s_7:o֞tzM%i9T5$י1"0cO1Sj`Q$ 
lϴɛx}C?|#h?8(^xѫU] bvDž= 6Oz6l>Xk,uaw۬W? s%q z:-[?*;>hvܾ}u{xTytZR\VyRZkn(jk[%UqQ]i6k^|m"SbYYk[s݇VٗgG(KX22Pk{ZƋŒ)0(KrmAl%`g_jC׿/:q(4%Ke_l|byrO^9 ʱ,,6 sv_=g3YYum&slMQ#l-"ɤOgEj/+瑠`P+T'QvGې ՋCn!}Wm߬NJ_quE%o :јlM dEmh\K Z!J́ cVg' !Wōy{8 ~?(a.:/˗VƬ nj~8H7C>iȴW"IǢ/v40ŧ1x6H5kMٛǓoKnyIױh4 n{mtDfCf aw->hkQt,s&V Cm69~B7a6XU:wʙG).Zź(uʆ6 &n@Ϲ|,P(WE?1cs`yIck0d%(6OxU*v/ prԐt(qnFcZb_黣j9wpsWF\dQJ1''cN1<0paBF)JZƘa0:&o}ĵ] oɕ+Ivd6=Bd&)ٚ}MQTS6`[bzwJ偂{:6c/'b+e*|p4.{Ub&%EdR H=gYpii'DNJcO"1.UR _d*Z,5i.e1 }eU\(d<*N*Kv'- )"'trcNlFvxs2WgJ`^ЉMA.(XRi7TeZt pyzr_aq؝>3Y>*ӏ&]s[_|49]('pV lߴ4litxz6Ut~O]jRJ='8K,;n+K۩3ϪAQ=`r ?7߿{瘨7ûp v`^NA=z 51;C͏Zu54 Ma^ e\[ƽK>-r)an- ҳǫ/ `Mc+5W=g L|T2WwqVIwpP_xpw87rR(y?f,ndlHAr;^qIGlqXJR,hB`HxrDDP大V84#N?*#Ia*dl:!#ޜ3 s(e /(*Hd!8P{N'vr}L|%Xw^έE' wy-Jޮ\wFWCx|ZHgo#rmD<+܅%[̿k|; #OAX8`w[:KƁܑ+ܑ<w0>x8UT=8\R@A)4Z8湵Th@S`1#QHYu2LwZZFL&Z܊1Fl:؈#2.$^DS5 d.).E;QeG@ omxGU[r&B(;(Y̏ZR) ejg.ӳ[\Ptz:yP4 PYjr15>3 o^Vh \Ss3J*UBLt4^fymsW ;oh\dK/o $U?m|/%;tU>K3OL3ie5Kn 7H J`c(1hfQN1~@'[NR&@6:G *hikæ"8;[ Ղr$-$p`1ޔ]\1H1 fzio;Ƴou.MKwY47UGrsKc+ߺMTȮ:HaIbS[1.&mci9Ρu:Zo[wZ2On;=/ytWZn?)o{w9wS/z{:.JD^ً(ux8`&zaq47Mw5olmD@UA2Wef\slS$8Z $hѻ!V0y4m #d皶R+c;G}.;P! o s Q[%GWHF[qܱK*7$!Dͬ fF oDՂZ[?k}NUn:=V)n[EĖPƽuH ÌМ:fٚ/_z,/]XY-`'YqZ6ST(tS;sSL W@LW C{˰qpUղfK5ƿp; E.s ];xx 9i=w$-v^KXUT h" 3yAS/R{#BK)$X UKpH# be}0z)#Re0H0<Hw :;Ύ>xWN]فr|^z"#,*@EN $ghbD ꤳA!`'xTθpB-{鈻uB3 ]L:vX)kjtk~ 0\\`~]]SC|TYlڈ=[FuT@Bʹd -HZx+HTi{|!屘I,xú d޴n-|\+u#z0M[a>6ԕmWX /rRHhsJJx(S,|V,[͞1|KqSPт+A!kU@B["#33c+/4".t w[zX)g2D'Hk5%yEd32DðqctvpXvL+w!6Vhq}Os=E-)_h:OČ'•[˒*kVb %(#@!bkÌg<_||}`Je\hwLil +Br`2&he) _3M99O''u_Zzq>3xx3)VJ1 k J%OVE CF*w@ 2NwTKb]-4+8$0tߤ 6nO#O\7\_ Nٸ8@V B8uAz#5c(*|,5*T=8߫:cQ`; nR/ ;xO^9"dQLV=AͳX8dqO1]W!}D -B)@)ӛ>2NAqV~t5z3W!Kr7Q?͇דlkiŴ^˔ל$ͻ~U 33όӔ֦ILhKŠJVg+)d7TJ#ܰ}n%o& Z_ɏi}wє' RjW lprgkjS]jGF/n;ަiE;NtduMcRѲ@%-:ڧ 5mf>?Jbq"6@x]<"[~^EB%\NHSy:bi(mat27Kf/)<(Mq6h<ʔeuw(~n@gͬHK|68*I ,z%"vA eZo[ōy4%Ǥd)"S%Q Hcv95q-3ty)Bܪ=u[wJ+md:ImZ70q3Xʏ(c)?eq& -DBMHf QxG$s?`|Xe#s2O>]=^U)gk,r^kC$X'D2)(ҩ8h/vHZrb7rK/>*#sٵy)~`. 
beer9 Ns,O'؁ cgmI 9Zr)# f_7ɯ9RC҃(Ug),x7'ҽ> 9t*.d(4a#A2?9HysQTPKʶɇ:{SIy~?Ue*Zô, } E釜nC?ɲo&|{o{ջg.Z ioǺ*dꪞ+жe1okrL]WW#c- <^T pVvJ{HLJc! բr,s2J,\̻RTbb,(3xn2ꂤI,EWl{)QzG{TqƤ@;epƲړ`8VZƝםqy:;$yŐHIF*%ͫP]Bbѹ%jL5*l#J{+㊣x;!}%MRhh6y:7 pP요37D`<"/^.jR+R3Eyb&zGpuX7ŷ^Qu6~}MP4N_}3!-F>n=-oebPr{='Ni[qr`8On9J)q(ɏ-֎㉵k.k DDJ=+ gk$#L,URvH(D RH#*d&Tl eZ30{-#cѱkeS#^/'01 aWo]mo7+ 'HZ*U|A_-`{ jd;"=3F4rIQwͮ!%h zc1i(&o)HK֤3̎8Ҹ1B.h6)+P2]IOD͊ &Q CeZ;ݱc͊Z!Kq+c2}cG+3lHqsMOCզdY뻾P@rY[XrPf5?LQ{V ~&,&qFGdUA5+38n]͹8/LUgO.אD;Kbt%[eMT0V1Rc[цsef GB!&[x֧O7dfyqr];^9/W}<8}<؟|[cV“Ȓ o| *Q'MU02tY( ).Ybm.LC#{1k0)i4ֱd1;:d`K*{l0`ǒMjNQdP9jBEĚqZ4dBFRU%"'O ! 2T5Mqbkrhϭ(9STplR jqS,hqWGzE6nV ¤[XktDqgV؀`F3nu(ܸud6;rb ZA%2.&QS:1lG-$vqj۲IkI]"ld'x-36" _06 *6XDȹ֦FVNաS]|18q,{w`nE6W635o7E?>Qx]1c+w]>^ĺNaby'OYEz 9 *u?l2#;xtI:a$%4+!5>vDc(!9 ڨ\-B4栭547brm騦\)py 9ڒ oFXˎ"r!uUN!a ScB%Cpge_,-j{O 2RhjV/ k OB_5}\Jy_g_{GzXt|aC#9k=l2b 2":]"ceɄ_S52}w\`n}(Vh0MgԻA=ލA; i6J_Ț'ӧjl=UZa&,n *Q,<3"5ƚ5VTMrDFZ]Ǖ R)FY ݟЛw~my쥵789|F%Ɵq޶y^|"if${b1?^8q{IE|t[wE~" ~Ns ?A7 UdE}1 bQvS #2%tpQ%@?pH~4pw){xr{x~Y֫f|:7>?[\sUcYʗEl2D HJ"-*L9R3X#RH`cLF`l J`%jUQ[E*؅nG|% x6;BO=[uWsEbbO,Iɝs/oH9bjψPM{Y[UcOC6<6.nPrлr(TcB>s(Zs, 1$ 3&Q1Ԑ#*`pƵ¹\Hg5=b^8ΞbZ $-l:@"()V5SR⭑B(K+:I:[uK'ۙw]+|˯ ;tşf>>_-ZiwzoWֽ.ϺA)Gjˋv3w|͋ z_xѽx:V~-y =n@X]@IY\.u7^pyN&lpwNv>oi{~îf3iqYPGiJ6z5Ds-NusAVI> Uy#3~ߖ]V [}<LF4|ٱ@Con.?/ uXCszKWOyfi1&«n ˮ=czNIxq[y1n vl*Jv˻ti{~"v9.D:)\[o)_Rr!<1;|T0#ʾ:5?D LaSB^v? X}Ŋ4/)0|[$E!@&b fǹyo@?S =~5\ )XL;ϲDjp]菍~Q:yfֆԃ.X?n51 Fr>9 jĨGBZ,䩢Q2|eIU>\S測3;n ՔKZc8Mg@ ATRNܠ)HW&t݂j]r%rT.!H% [FaSQ6kw֫fX t( g3Ub >u1!Eso! 2WNO:A:6UV=;Wwδ4{lXAÎrͶ;6rrXMjo{]x9ئ,ß}8?kqx 8Hh{|ǪE*[c~m<=\Vaѣ>}7E`>6Zۗgr`WÞ~ݺlugV<ʲSA.u٩j ؘ,7a!u8؁d緺>®9+eXHFூR͎N'|hZn޹7֟~zyy~o\}þ[?ē(=kEX,:wANG}A cAhPf˺D2Y[uulƒ'` #Dh6{OGQ;NG-r<n@%Įqf#Rp/[cZ|%S#y{M#?RN!̻%>yu˷ N|]R~om.zG¸*8j6RL. QV[!<0b`Mn5xbP"Eyt:h߶Xf]#xykS1/[F-lЫ߾5[J\#rzcZز*l p2!Yb49 }!GuyR1Uɿ{PN?u}଼,Bm2,Ie -Iy&5Ⓝ'Md.ō&vޕqd,0T! ug` iѡHle!IlZ@߫U^t\8$jFM.;*cQ7hEqqYks?.Cx;"qT!y8`R-vABHU: oN0ГS֦Ⱦ'[e+fsAh@7;Q)Tw`[9L m(8DS,:hO<`,wݩHGk3d_P-KzIZX'AHWI>H$!s6y"gdsgٴ/TO? 
x;0Cg)`ϲB %WI@4>XbufCmڿ݂g`@)rm&ȧoLyZYchAxt#Tלa9UU5C-0g%Y[n)OWzu<׬"fe$2x0)uỬ &z _ݠ:̟eAWu4|M'j)6gF\ zSk#ɃFcލqGIY{ɱ6$Qs-9˝Rp0%z_UկmQ?S)n[PnP%v/:OFNrϺ< K*)FB:(rF(ŎYm#mib bXpYq`Z>!xwsm9yðhf<ΣK09띢6Jh00h0i!`%x0bپO#2DJNHd 3âҌ;0V@flRt2yp6,3燙.5aĿzqzko^7/~W/^D]՛/`348gUPWǃ*xP_~U[ECyuT^-]ʕm6܇b+Z/ޤ䔮QStX5L%g~ b~cEbm0<`I|8Q~.6*xyĎGlqڣt/ 4T#"Yȅ 0D붆77j {.}CFnKkL"aQkG)K`ID֥T\sڜLn~sy~X4zN :Տ㹺?K? =<&Z<s@rH8+:h)(2ʻ_/d.94p Ï6:sC |޵o#rmD<+ϼPu{NivtяH>삡S!dT!CHb?sǩ [$$ V#AăRhpsk48T`@ B*Caj3jXWye4zl5ͭH$5-vUݴ#2~4IƽKL vȁMأf'k`;[0]EC̈́B/, ] ̜7#H `NZ'zMn@z>}n:с&9bkQUOESzTh_I̪n)I緾̩,0 t>@Tü.b_ v mHhLpµʇhZr,ߠ~d^7:,ѫkQTpY^ـxs|/ -&l lR:d5ݽ|79s~`:+!6tIGwN}8u\W|4M&Sr} cKإM]+j!nLڋ؜uW6?ϖ⡬dCRO2;-lgX؛EuԂj}"TVr^q> O7-ڼ˜۹BnɸKEtXѣn6nÏˏM[p>5Iw=gfI4 n-< \btñ0'ܾt B!m/tOD:BKy16Q_ܐNr7qhB熊Œ#miˍ#gemϺƔ20;¬Z⍨ZP6^s೦G*kuMGCJ9ō8`1waʽ3ø)a-֪ݗ' 4Gs}sqdfꓭ8.QpHOcTN tS[ySWLWW@LW&Jv:^=`0 uMf׀]1i>69jY!${ߛ^gAV"Ke^Q0;Jܧ[!~ h1F: 9#A2 \tȒushTUNL7P6d2 _AqR}(GϲlmB/߃_TuH,|y@Uou`^)rmyv' +5YFKRA'TZK[T𵘻-@L *WT&r;s]!̐:=O׻w}@zgqUHT"RcNYA%z|.R!ʠqITin=ˆ>Q*Ljd<)V2""&ZH0<)c"-{VFqh=PT%|MtBGy} }m-om!7i dbUN Q!rwNI 4L%x`aR,|F8FHsB@w9)UL.F NDEU!hm8W`DsRRąvtB8 yy>SCpR"8(,sPh6ZGNIy7_ yw50d>Bw@.5JU5mh6O?ΉYDXRwBbkYSe ׊l`eBQ"1$PI S<(xz;Ѩz:B9(qɣ111< yAɘ= #,4LJAkf5yJȜaH9|w "#hg''ᄄUh 6CC*(PԶe׳+X۽-6x*\[cl XG8RC :HoiE;@"ų}w2K.* 4fuVY̑͝ Io7-Cm$lWʵ`4< AVRffljfd^x&A!P% % \-UcYolA[@(U@$+@Mrʲ.MR/ާ!",,Vc]f8pPVZN۴d2K~5 Tɜ(v&V>eh3iM2ڊykYl{KYe2}j崳AZٙ1Mšnﹻ7LfK [٫o <(%a0t-gc2?GD@Y'(|lT67Wo3)CoZNH/>׵/ 3 7>vul :R?j%*_ziV%9yT\?Cx\/::KROTM3 ɗUa}YˠҢL%Q Q]U;+|vWiȷ!ܔ a:_sE+'{xW-嶋z|za=>I&Vq&{ZuSމ~Q e3PB#ש1ǯ\jp.b-{[1Z%Z@v4`6s+=f.銇D"2OXw!r1Zi޹ tm~RaZ{Q[+|x]VZƅ'>a{RbxHqQZ< *Ǣv%Q B.l!s@8ڠR{xx9C6{SŽDsO-?![~toLZ yb/i;cB!pbwulNX}Sr?SJD`H9e^c)Z" &:&IAN;{ %k+T3poUt55'{۩v}"ەRTkuPO)T} kTi =F5AӚDs9FQo-sS9zO&bui\hk~sa/c/Ϭrٻ6r$Wld=|݇8bY$mOdɱ+$[%ٱ)[N$rwbW,CXV6P4'p IrLJ:­ reAm )q4qpC6 34z<~Stp50LmisO41rgs k&RG54K']27ᔇhΉ{M%A{76ᵰJJ2 4\&LK" L,+%#70fgRpOܜ;Th\ αe\ɹ->rrİ:>ڒ˱ܠAo\mV1K[V886tD(K9\{tCAL8J^A%OɩQGO rچ2 XE!< JYh""UIGx^&ePT1Ulctr6]]|ؘf =b.i"HJ%{HN(}1Fjz>u`] 
ƣ5sHÆK/oro]RL?\UKq./ALtb";;s_AJ Cj.zk \(Ct]kBV_-m+̺577e|Y%\T_>yzh?{ovjlҪ:`\hVK8hx6xt2$Z2a+%gZ̾Tll+6Hoޟ jKu,|>A5s *!+eގMh8͕`NЙ|J|Eb\>ǥL2љEZ P;Zs5dSG=^$> !zB/IZ%, 'm)"g-xD9B5y@vT@V|rL2:akj펚·6\NĠ"׳ɦP2e'ʐ+ч;  ްgo #c0,BD!SHfF؀Q)U#lč H%΄Ж֚P I$bDXC!o8ѫ }f%r\-. h:Ddpu Tu>c#+"{( B6w'ֲ}~p!EۙJ0F:,b,bHzC\~ܖvR ŻYT>Nѧi^/zEAEזBUwZLTuM(]fGPzttv&UUjoS ʜJH\ Q\%58:sQ$с"fr`NXL"K186ְ䙛[}i$dFan*/7d\Ƞ.2kz4>fƞH칒;%#GJF)mP22>Dz}B/.$wCLG+mT,kmꖲR[Kΐa ht>jO8޸ Y)YG U>E5[c$b:30f8/jAƣ({I,[U\S^v+݇O>}HY*Bc1ݿ~ܽyoL]~<9[|K7-gBwqMG)u|ߺ;{u~]_/KK^^]K_v ӳ%@Qujή=gqӍ"SanWw;HO)wz1t0)ijU7ûQI*N%JJNus!i'*nHB$n>}>x^jc*A9A1գs|,PT˷_&a־w :B ecx23dξbtii<[`3vp2*P~:Ma`zA| \??cN>(p =d3kSpkNXˆǯdf'n^m]i)!G~{:! ߧ%{}izWwu>EˆNϥN@zx0d Tjc˸2nqqOD^uEc) :`X+-P(2W|o^v?UsѬ`YԽfeYAQsߌ7ͅpEY`Z]KߵHw 5^5o9"dl 9s Npi]@O !iͰ8pIkMB4DhT1#Q'adc)hYV18qРE_P:/B &bbŕY ,Rf:BĖɞ$T*rN@{UOKL*%!gg2Wl&NIXf\%ìc{cM;}kVY۲VhuOY3Uۭ EY^| r4VVQ^=X+zYгLt-I!>}cG)! G F+G-1}X["[ B)Ĭ= dB\sYreR*~vz-';\?6Dǀip0bƼR94ߚ[({}T d'.s/> l`mhiX^ Wz |(g +e:X&yEt\V:ƵY#'f7Y7-,7yLZ`27g}}P64Wȼۖ07DP׮ᅔѧ2IDK! et^ zj8TnG)P(e<3(P`2*+Q צ71e( *~O3~.pssӾd>gS򌋿YϐM A̒WzxLC|z۱?zTz`3~/iD ~6F%*}adѬT200c|8O'UΪ=6!jz(" dN<bp)g#{&D˂BPuM;trk\Mٴ瞭5zN1%W4f4(U d /#~Æ),ݴ wi_;qpgށ'RI SZhb߻>|Y-YGFjld%\~-_.Q2hhx"g(DQ*鰡q &|Nh_nګ%w mS]U]n~A߼XqsO&/tjHu29'Wt5!$Ni7so]kb r͋397^^]3ZhmҴYH "a[$lգruضHg ~E¶H "a[$lm-E,,- ϛJGSh*Mt4T:JGSh*Mt4T:JGSh*MAt4cMt4 V 7Vt4T:JGS9p`YShht4T:JGSh*?"SlGtpOǟ3Z^"{ЌKއTo[NmsY}¿g[OAT4ٝ5`t6pD@N.t2ک4ʨBډ٪}&%V ٻ6$Ug7ԏc:a.'P$!(W=!ǐ4h{ 5Ɦ_WQcp@ :g(@,J-Jj[#Tɞ}=d'9pS`IgW ̷ 5Zr(gi'\JQp! ݐw'UvJ57B^R'#7Z lNVhR/ү6a5]T|'h(}tls;ϿD(Y$hl_Q6Cm\dZ*j"Y7m,*Q8FZ>*jE lt44kΗrz3'8r:q]l:Z +m]^մjuݽ^G+ꪚNjWv/:Y#Jew.ilo"VMQtkXH9٭u5# SغVuq9'ՎNWmGnjb[ γ=o9AwyTj}7K8ȳ1-ko ⺻[lS^o󜧌ǟŐM9I,C1uzQ$ Eo^p &Rn, %x$CPbA$D A<E~d%-|$4ZWg}$y 7>׎;9u՛Og 2 ?,EҸ:Cf2,)sh*ؐc1;(^Aqmo]8" 05q&rhq4&3! Foi/"-.#1Ezc YZ>VL*4J[BxHT!%$5\јӼģ?Xc qCC,K $9 .96)&K6$2$ `0Tiꗯ*(_WrJRt Hgl_S:aI|mspThNU4E=8G]L ǀ\4 3*t!/dJ!:=;x-z tw-Ad91FD5wi $K! 
[binary data: gzip-compressed archive member var/home/core/zuul-output/logs/kubelet.log.gz — contents not recoverable as text]
̥2ИAjCZcm4c-ݵ .r*o& Pn=sQ˨*5)ʮv"p(fHMi *mVI< #j;.E'&{rtp<00ɇqU)3M UMt'G;I Kax&Rch` K!Txz߫A-Do)]|OΉV;MhyG-5oxIB+3?%9pc[缣(goQrqKeP"6,ۙ?D|W$x'zP"(r0Ǩ ͷ<_YEYs!EIKSRfRs,3Oߊ-j?,E8Y|P:Ǻϫr5|@J^Ȼ["&6@]oaxӢPUq{">¼Fe;8䙳hOsyUs={Z^Wy*kp~\!+A0~U9ʠt}CJ)x y,ZS.9¢"Xۗy xZuNZ,P_rrXԗԪVum9k9!ϜExVP~/ˡQU !;F%:*ò]ـK.~~q1u0雐;qr$-FN5:E*^L;EfH8p .#Axdb"ieuEj}n!" sHUzz%לNXpuZ [CKvTja;ᖁ+Ӗz[#F %h@f0 FZ FE+ V9/f+3! 8jgStH8K`pZ9kgcU3R>X 'T lhb@5u4\Nf4 y,ZS~'y0azڭ9S:>pД[1ڭ y,ZS |Dv) $^Vw] hIv Yd° VM0 -auz E3c9BFQ!/3P!.5 ,A9wL5afтQ@MD]`ϭWG.<7BPLh`B.wF^ܴ~$V"p8d]rG5í\?]j1sPƅg;H 8E" d 1xvF#˸D@fQ8б&ng FOqI А$3H8 LsBT%1 QBFFW6p2lOm H}=$0] Af.q"l[葿Y&e맷o=!R$~\0[56}4\\DDhFѯd:$W?glr2;jo./a|Zඵ%@I~ztq:tdڴ($@;z*$$4K٢Uix_(/:Ռ7R"0MF|Py #($QHm|lS᪇)(Gq뙖t9fop"zጟMҥ¹>eʥ dnӯyT(gAɈ>hER5'e.Myh2)ts$嗹 @ 5H҉\f4DCqgfFގ7tulXz:l+c:{g<  I;:n@&ߞ&<_qCvSƛ%Yw/= qs[->N{UR]w2Kpa:H2x_|y~u~܄P=Oq*!Hl׊D1 oNJX6m-⧻VmU/l{uX;o.եirpJt8OdEd)-v8 +MRCx.5QZ'/ؔ쁀F$}PnWy.;\Y%oDNLVQGDNHcۅm6Z$zx(bMkWH~Q5^Ի&L8h*|Xr-\RΡR;Q9nX}qu%pJ'FtCF캺T.„̟$hB%;%*f@ unS 7Q{U{b?^$W{lO#ȑ#W&,n$󧝼'xdlMas`"# Qޖ=VI./vDL5l8)P~Kc211` 6OrTb,xb}H͇ Ke]좙ܨv貳7FOKsk~0k$gܒfWΝXl?~Գgy=f1H] J+G:F s*,u%uߒUQU2Es51XPKտjÒtV%Z!i9*{@djR )T m5ex,Xߌ}BuX䑚yZvSXF 9AO:vյ+uux]yj2F(b0ScZQ&ϸs=§F79bHTn\7%v-ܬ%F']|F]az7[+>"D?)A465 U6ԙ?%oFt[ōh|iE >HBk caOA;4*(M A%p ߞ y v%XvՃ>1p8v -5t0DZ!pIR" ~QxN ǯ,a CuKǯv2*|>./(GK}n5./Ssxe"eayaH|mq&^qE4h7}QҀLLA*1Di͸HP Kʼ&U޽h!<ع8_O"N {J@,ҜJ)u>qKTˀܢD[W'6R ts /X?cW&Kn&qTzHH_4IJO\T}'(((h拪)ˌK5B[DS:+)gN`2U*TH%,SU.;P꺃,]v{;@{2F,_=vjQK*cRT瓊yQkK&z˽Z!g9uL{(6vϨRZaf@>\eQ;P-Q;L%,Nu> eDTj  2d &k;JsjBkdO|@mfQӓUN'TSo҂ƗXN@5 IPHMCR$&Ϡ67k$UqH0I#E5ځQɴpG"BUnҨRU3bp~%5+FU.1Zbu}ZlD7TIV[TAK-|#¼Q%c΃2J-QQiӆV#5v7Ũvj%Cú<=XBK%4MXΑSQL0 m:y#Tݹ3G(H@Cghf󿭬e,Ήۯ=bMC8b4WE_0p8~:VAHD߹m;5$ oW#xIGVt^$s \"*O BVۯ=$@Z= 1 T-w-z'*^)-o.rZve:GsOc9 8n`J-~ދZɧfH{0;<,'!2e^~"BP_〬e"`Bz 1U!iApA+WLBq'NN5Ёa|ܚ]xc"4D9^oַ2!2$ T;ɩOC魃[+MBCPΜ"8G^P~k#d)_]eWt{߉LvnO>eKjT^3mJLǤ.[Lsb&ZX#)C!b)hzUL pKMfNF~}dxeHg0_lf[+6u'%*`ZV6T )zV*űإ~t3B=gO=M-ex6Llufe (-cE!1i9WXXdD`G,4"VƬ8 ^>Y*d'>gKDt1gvJxoͤ{/>k*wsg]`` Xdcx wuDQ>p;5vgrFrgJ6 _C ~8kfjTU6 
82-헟/f'UB ~ fv<, p3 3?:hQFW]U`òɶƣhXo  1|+8ϊGenP"Nv?*sH}>A~T:i*asQEmޓ";DIsGZηmOkkǖ3)^{x#RBp,$2Ȣ%Zw]w>uo&| r.&SA3ݤ:f9 bWB$qرzU:WSY1@( Y4k86N':aĪ'Ql.>Poޡh &"4b򥄯_Yc2WUD;(6GS?@5H҄ŽAn lGj'c.r{Jm7e]}.w$0 F)稜)@ ^:4&Ss3lXMۏ@p?ufs##ck-a;#YouHSITPk^!=c)b)> {Wm^u׷owێҩ08qiNiB4zsپ۞-n{ {Nd L"|aXD5!i(ܮoף[X20w%rZw_8P pXÐQ{z2V]ms䶑+SrwVomR\˥ԎYkfwI_sf$Qy_@r~WFwĺ/Fe.Q4?4mM2Oe6wOYq0[20ۗ+2Sه_֫GfxN"ՙ~Bh ԐBo|uka9p'/r{_|&q&ⳂYj@( H{ yYAސzECٴtQ3$%Zx7eIzۺ҂gQ1_BqP/v8~[{=i nfnG?~BLRf~6]^)s?TDE\3͘dW_µ#Da^jzTv]h" Bg@ L08Sh񆃅& rqvJZj.5d3)e(Ti)˝E͝ 98"OC$#(Ljta5)r?g1ϗD;8n䖾 \VBNC(ݝ0\Fzo>9ß7 9SК&;P)Fmf\PÝ,Nƻ啿"J 7` Tg tvgOpvhV31krZq TN@'qHs:qӈtT%K2i@JrA`XѹGbdb`+)w[C\(i 傲\iO.& L, !><G~js^DK'UQƸ%!Q"8C #H&rPe2)ÿ#8̻CQ&ɼU3JW;&qjLgv`7Mp;jn6}.&JOJJI˩8x@?6}Qzu?3Vktޖ]9?ߕ]9?U ~ N $g\y3yNLmjť0L]oEf-ӆ7X/71y̶q: ;?ݗI4a5|s~Z=f?LhBΊ&EE4EȮjV4</pKr CaJj!U0; Lw9X.eNe[8_(gcKbZࠚ"Bs˥6;)%Z(S9q]tCE&ު]-=#=낌ܔ(Skp5vEVYύ)ZڜY3J@l4M*ܲ~xdx$@w]I \gϕ FRcx oOHy'm׊_6+4`%X; 2-4 7:XhC2 ,x$ ,=[Pr8/L.vN9E6I{&1U9n BJq'MJ헇~>?9dA}AaxεhAqXhI(bcNq])56 Y>ܯσ5$ FZf^IeH!C!LrD)Mn-E@슜)^勏)p<p6k'ѣaAG\ 'm <.coX-ooQS]'3@(QX{@\Bs엍3:X M|󖡕0yH4<74hGub:/U.h죴A[~ >n~_Z&ƯV,T둅8@G^tR:XRJ>|Z}݊0CC8@I%qHlŋCi8@~XݴutG2V7mǠ=6T"WGS{-TVL KQV q3-pPOWܤzu(x h9 ߥ+R 4 B !#hmO[aĴ:Rf!|CC5=etAtjq0f(eh%gthjU@ tdj# Gf[žR.R \ #/ cN+s! ȱQxMkR_z>+ Wv]~Zg;zxdsiV}C}V+ ,ugc|=je:+ߞx1d*]X 51lHYnF 6д"C0Q\IL0ν|?;Vn3FN\nyʶ]+G "$㳽/frqY3S蒨ώe!|va^}0C_g;cyʴ/>f)ixF|BS?z3ްN+ʨafZ8?vJ^a y*}!fT>/RlyGOny[3QC,Do݌_FA}1-$I<=G[Gzv8:x8:c qm#FRqt{#3W0U=AسCᣜ=wڞ~4ku|zn{c4\:"=;>)FW>|}5R<5t(~vUlUdd ǸOdžYObVIy=>5 u;t>zzgo_AR;RWyw*0nƨ*#BNkј `c$f*ئQ/CTcN,ga K9s/ Jyq70|IYفq'S>?D\(ڻg_>4c_~}W^ܶl]Wq*.z!B8ETZrnB}<,F'w9cTr͖IA̗Q5h2s6W~}uj58-F4i8iGhf"T$~^vMQ= h4LHEr?Nڔ8Gs"?vUxEr"P?>Fq"Т?6UEqr9# #JsQ4"9_E:d˞WGͦ^KcE!z|sq>R(#߸h$b_5z]$+Jc;vҧU^7G|V{_|Èi @AφA{$\p“*0o C^$\0z"-38z ),g5Bah7Pw9CZ8/م6Ȝk^}5ARY{Kr:HgvI81Z@t7ڦI;CefIjmBxw˥0pu.nDP]/U>+40JE ($"`*K< F ?|<\;)H ʻ]S9E*ʿ 5*,_L TEvJNS} u*?5"pr-L_oETu~J8$m>S9 83':{휅\S=b uR M]jH|:q>Ym9)<<] J Mk.ն1YR])gA옂n#n3'. 
t~?ŻT%Si{F-z)`I&L+Jߕsy9[Emp.Gb"I`l}AJk&Sva~5_ܖR>sxT~tzO?-xFVD*\xV5Ymmvb@[.un[5dEwRx8U AT(ZE" -Z*4 N c U˲`o.u4xRosQl ]|L3g{6XWeގjm,gl%FplGk]7<4( Rf&E) eAPY2LІ( ֊T0!B7̄yqo-Cbт+A!kU -A1ƕGDžyzg "v eQSO5\+CLӢ $(#P@!bL$bc4D67(rflBjE\GSR<18ۙ#XqN *|C'J\ de(YLӋrg,SޢƃF_۹SWξحG1&$`4K!#G'Te1PDw;JǾxnʓ ?ϓbz'w̱s^ij5H bwAa&\^b-cDc$c|&ϖL*N!oB$V12"+f  r]/py1:ny1fG!0?> :\^,bKrT =a`-1̛B~ZĔFibK1̙9+GʃZ"x'\8,S& mhe:k`KTZ="W~7C}!tspOt@0[l*#Ja$3b Tx!C R'_uӳ~9lI JONVmsYiWe쇟~ݭL Y$ Rs6޹/R30L$ss%غaM7ZىRCV؆RgdK luf2J+]AK-BԆRg'0KMj9٪\Vve/FD2lRφAzC8ȼx|߸h>LF+_ GӔJsyM}L/N`:ש|!sӀ^Q/A4 *QL&N yt$zzrƻ0r27f,M4Ǧ4R{ϡD7Lq[.uL'DEIo-WD\ N=։n[gjߵT35a!nY6U+kэ!*^ebPkQ#k:eTlo-/[r&eSB}At# }n1{vWM:hk-ftkBDlr6nwŠ֢F=Zw](ݲ]wkBDlJA~G"bPtrNta_9m-pքfٔm)F>w}MzHкv]tݤ'7L:Oe~ulJaTl*Zr.H->F=Aֶ6˧]V׳ը'hպ[$D񮿧i0t_{A8 u ]/H 7ރN]@7Ф'T%і8!['t5C1sB1m]yWcj̍zoaqWcj̍zu5f u5ܤ'PBHjӮ՘.xjTj՘s0j_:j]y}5fFxBNWc>3]֘9P]17 YFw5f=A֘!R]17 !پ1w5f=jվDt5ܨ'H־}fJ՘sn_YFsWcnf՘S1w5F=s޾(՘sP}}kj̲sWcn-Ǭc]17 ;1+QWcj̍zǸ}5f%qwVvWcnҌP9%}jv&OdNI3)Fw0benQ}E3xD.`6˓3>ŭ":W8^{uK!`8\A$)ù r1\=ˏ o#>fo#}rXSb(-ǿ19+gKVZ [ͧ~4)zٷ/^C&'<魎:+:XQgWa4xxLM}A2'V0k$y C@ DuPÈ^ŝWhBWA}E NXȇa!DSW Hqk\_EYW1Mׂ2(ꤏ!8)e w^Gs1a68E/ɄZ!(J`i߄x Ex y/&IBJﭳk3!TbȺAM`:UŽ)q1b^s ;H34P!4BM%Q%Ga pI1R0 20h7y,k`Rh8x4ec-P053SXN02!_B,Ds`]~0L0'#*Xb'`G:)s,xsD&!4b!0tf):łU0 B0&[ Dd'|JqJ]50>vM!C,~3ɽ~If'(/U>=9G͠rrܪ?]8O2ēgUK8*&fypfX0_/<*~ A S b4 x;Cj-+Oxb [ʸ09nf)r Q#;|_wWՓx!;s 9Bv5~8Lk"]|R%put!`<M@S1@Ucק_r0@UFC7 7 DOg7Khƃ)q tp~^Wr+z;1 ?޼Gѱ+uumn=Gyj!gQ`*Zd/ʠt{wXj?"SDi`e}LqJ`zM|=˶Oc3*QeEPsޅ4!6Do;;@x94TJ<}{SLf`nw8YTЦTH͠54b]Ζ0TN\46Ѵlzʝ1 5bްכw=l{;|?}ݷJn>y&E_w>*ι[dlLZ'};ijR3DXy/^Qy4Y#]`X;,3~;GI c|y;c'Kk/=X(Y};mm&UUMƋFa:F"t.ZnDM$J[j)Jb :}`LdZUWuW6*lnYƜ9ao o W۴ |pMU9oi-{/ي"6>̂4V(hAyGEqĜDMe}kiM/jes:-K%u8ַ߆֧R#[ >YT}3ac3n2}0ͅPZ /7U)J~(6' lVDʴЌ#T+CJK&G7ڦ ĥ}Xk pe֛k>rHU )TRT辥JأtZ0'q[yNHU# AD:m# \QtnT !ҢjYzDžV8,4_|e!>@*oC4i6{+bbBnF jv:Sg;8dc1΀D16b|!1AJ VIkHp2l*|HMG{_+M_C\SBY>Yˇ9sS 7&j0Z f !LM'm(%I+L_AZ|r VjGp "=jimfjK; fӰvWXWjPۤ֔QLk:Qvx\^2 תVU"G3>M뎏"l nպ@44ǻއ_ r<%Qjn0#U̯jfUeG@f$:d _aG2x7= iE^K9p&V$Hųw_rjm ߳#񂗨(uT?{Wq / 
Gc>}Xy/rHVOD$5Fwg5qLai/ʪLϮO-lC?)`$')>II5M #x$]fm N,%>X"$9|Ju cƓtEEWLe-9j6^Ɠ૩6?mS,*o2=sl0xw6W)4:[!̞q^1/ TG4`pESn'wHg\0QP{v)|]-g.kjT+,~V"b{Q4 ~*G)Jc'|ǧ1>0Lv*Eݼ[Oҹn{ϣ(6-* "# J(0@QvM͋ sԡm_Bki7:s_^YFU ^#5g WHXI:v$Ҟ%Y֗Ξf~)\݌pY T@W$ŷ{q2 #!z ދc;W%̞.fRq{jB֣z;G/II9L 䙻pX-1.;6]#5uX+|~&et%wu 2˄w7 @b oH~/dψ|vo\$Z]H5] r=)aFw s;LZ z~vheXtR4ćYpww&Dr[((J+?MlEGI4ϡQRq f+bC ъl%lfEf s4X]+ ʌc6VqN[J]1Oa~{%6Rs% [6"pPjGfBxkyt 1n>/?ziK+&2z~9 gqw*.qG܄g_;o#ͮ.noB$ݴ@(-*]{Ot+KEM?WQ@r'|\mxJ;nRԶ v*4V6wlaPIh| Q Nv{i'|ҚQNH.$R"5*GI6), TejC=s. ryRBz'T\.4ՠ[rcdZv1lH UqIODWa]%HvkA"e:z #SIr21aUe9|O/ǜ)UJ6IHW4NIksWr$'GbMB X #6V<5KEz3F{섒 K UZ%e_]0Վ(dR#J ZS ĢfD `C[Fp#,Ƃfb$dQcIIc.ۛSj#%; bifLcV<o~4{vp$ ֩MF_eg4XP(V PYhgg7go\E)BTEz5> 탣X"W0kʸoF1?xVfa5PQ]P o" i@T@w:5c5EC)>h 6 ^{ys"{S^{+$$$T9ZxKbIe@8ʬrB`MFi"0ʱ18]xD 5|;T&+vjc蚍XpˀSyf-7B3O y-0i/#78 HyC/Ăkb{y=ěEC!Ƥ,.dJa#jd-`^~MZ}u)<|$M+$jGU/k9oY_Wm9Keߞ6z[-4qAtb^uZVjO9kO.;^FK'If#BH!I/׫!J/HǷe].;kRT^7PD0ou #ev*#)gsf9VXV I2{ew,gm˄ԁ*MK\Rg$~֫_NHTy(T m'2V_% HRoe$*bJ\d`"`-4`xAJp?Iun|jREqdXXӣ Rw߯yOôbO 'Dl-(?n,ADWI~gx5͗pp}}u{wʋ`"ۣKltmf/#S*1tp|+@tJ=\!qXoPq>w'3MH7E7 RR"s{SV{{t#+f_Y\0ƕּFZ 'i7!<ƕqE{Z>9 _I5׽/&)xݴڸ:T]'ۂJcj_~[Wrw%{ ™6"?cD_y[49hI^ѦwwN9j-VX"SNd4xl/x`"K (Ꝥb\:\#^.upq5O?tR#f&o/w4 hys8>+3nGxVcY.lCX ]5XJ/B.EBqMT&J~` K#2sq84ӱpޑ %B KcA5 *rT)Pg ~RzT N8$.fF.S.4T%$JʱNSV%BZw K(i9fJJ\9Y[]MOܬV9"G'MOr5eg/s6߅JU&]|Leŧ&K'Ɏ`/>2\* ÀȾ%{`RV|8fg]hRwRJ#^ ''mH=]s}'r'oI~L(.Zhmxegւg͘>r I38%zu7\M5+rX~iat0ޡp]Q#|JVHWX.I/bTrf­u4h_fX`vo %"03}nhr82!J$]zۋ…[ݚ˕K[XewF UӳvcNO~M5YBwSu'`ڣ{D.fs0w\[<9QaY,bQՁ7\gilA$*UY꾼VLrpSЋ0_~6^w^xNklUm0a*X򠕲w_d/اdvl_.7-cGe~1c:<:謫U.3 H޼ʥhLrFx?GhG/DխnVK*9_vŏjO!zu$cwp ]C')(2$%]uJ]zԗ}u|j}zx@dIm\CsO47y9&\+g,]Bu˾ˆs=W^~xԨ'%<{ŧGmﳇ!0\yU{cd:- wE{IN'Q&ai5|UH; '8Ae[YAKne`eQx/v:+}z_K:kxGv~/!h{κT2C0Mgb-!vWѳ6R-> 4Ä#v0P)M 2"LI^uFO%/ǝy_ e5ӈ- y[TmMcW̸ϯ_,;ٯSImWHka[U4A9Z|͞Ӟ9cSza" ~KAer9o~xs'J7'X3*ǩ+ߖiM=KߎZ+W~U_iLA 0Bp`G,t6TօIqHpˋ o :đRE, EV3*~lhɍ'~NŧFCRuV@|Tj:ꍋ:L㺫lrXT}pV*G Hv(ĺBHm(a@Tx_zF?7]~ýjjCg>hȢ6{9<Ӕ.Ee/5/J '$^04d NIc~ JkojV%DXngaz(!=> "9T}JzHJCՇ2{2̆7V#* =8kȵl rh>j fJ g_{k/ӤR 
ďCpu'6sI7}nƻޖPKo>n?Bͭ&4^~s.s](G1f>̫/K_x/yȃ5Ӊsři W7\7'нΖ Vqw', NZ ({SɝRǎ D#&ĝ-L,~3[ O+2$8$lL")` NlZ"4-pAyȲ,8C¬& kf'J~fO=$M-[$$LU+@tKO9-@Y "c^%FFRx0Ivfk~' \ LŲZn  8[oPpH̚8]~}t d;46*mF1>rRe;U &7cCEő5d[S߶zba]^bcLyhJ8XoI|lyYݒG Oe"igp% np!XW1ohsp%FmEjZ5oгFnoqR*Z0ACȖK֝|g&1&e͙偑|M'P!T$E%JuP ĩsB Q,Eb-Har_'I<-Ir3Zf7h,=J wg#Izwa32D$lJ}S$Y%/9)BS0%mTF-O}Laզ'`J$:^ ;L~եT!ԭlO) ֢MkUnwx-+&;؂Dn} 8 8n]~1mI9͕j(*HuLkJe`%:!?(٠ z-dYۓz̭ݭ37*Y;[n[;ng|$qz;^Xz̼ni/|>%uk}#_pĈvEŚ;"rP9"qnӋs:*AQep"UnuCP= :` ,[.wl+uO.L Tp**"9EeLkn< Lz.̏S+shM8p#-bHuW28*uaK#$>@c-.݃ Es(Wsђ>t)9Hfo A(3h5)b()UT(>($l3UmChIk5ڎo͛yS8iojvEgNw󎳆\W0{t`?CM֠w}Ijbg h_qʻ,nQktus+RN[5Apm ָ{xtQ0-8DJ;\1%e$V3{XysU}Jy0l]o/XxIWMc=mm oY=p U0Z/}N:ι{ݮQM[PAzZf,fIYݚd'c,:fU]$ZuX7GVqn [GlpNnCPN>n-S0]zĥzX\: ք$@j[/} 9iC5gO-ӆHz_1,6RFP*P6ГF(-~p4 Z(C3`Qzo%Aj0dq,Xi㈐jTf3=g֤r2vh: R*yVoo#kZoE<ȵ-岵vL ]4W|{7~y}EAwr G.~G`4mVbŪ~˙]MRo68U"KFӴuEghaVaSC+_ǯB$OTLcʸ5S:T+J4?W5i+gC Q-e|Sn!r ߈8Kuj\`Q`a0"B69ˑM Vr4D"baSS)b2`M(2@ 1N1% ̍dž vSV$[&曋M&"-cUj:5|3^` M\TN^\6r䒋 of9n N]!{@moz[_e$=/-#!-ay=&a|qrY{M(E;WN"1icp:cp^ &AR e,=q  8 J"CFS͙b$ie+Rd%|{+ɕu֛買nT[?-jnf=[WRbC:&=DntS'J,̟#]n(H+?2:몺ʬDD[띷Hjj +0 E݁Jϒe6EY27U7J~S-S*ͫ%궡=:aMqR`]‘ )SX~J>`,L0)\PtZѾ T74b(KTS}WhKv-(#Xm3hDW3%gݵ$!Q1TvK$PW0%+eXіe1'8'T-j$Z:0<EB*U(*w S- Z (\'"yR4^0ZN#AE(-}Bjc(b91 0nػOf[ݺjCVK/GDs@q>aʱeCo+JUe1meQs'۳*GŊQAق3GiDyXÿ?7DN$2WKZ~zg8%:\R^! 2S5U~.{>}$ʿDҙ7:#=G<Щ;5ՊSNk/Y +Gf(*(vXXE-0ժ@yeJ ϩ%gTHͰ) ;"Gc22 #)"<|X cEQXVx^\AtRF&89)d$i||m'} nI%(_q2!z8F(tJ0yl| 0kqcKKN$jk=gaLժGIޝ-˵b;L>ǤZJRi .Yx hAo! ZxO0rpx Il AO$@@xl?RYa a;L ?bl C1+F(n -G H,`Xvm~`nW]`AVVh6x5J滟t$A`0A6xﮊP"+ `c޳]=D͊NmCˮJ/<:G`vrr›x%?s;^hxϐ'hTidcXx 6xON8uI߁H&RS}rGy$\l1/r"H+,R:'֛\!QaTX9e )d!5D)n騵"䨆,7Kk/|tSDI$𤏘{e܎Τx񛭃 & ade|3!Eb_1h "'rpTvL[BTuO]QhPDнQhBXLQhqD: Aᭀ,{$DpJzϘTR Fr-1㐧͏bQtݹ%"< d-4kITB4g}BJjgH /2U1R(+dl-u9sNB:J 1 hHqFz1p1GnIOOV"ss.).(GPH =lϒ'5G އ‏~H#%('G,8/ a=훿9{M> pwn~&\JV)QD߹7^R%4hػ6 b/MCQyl2l$T~K>H P+NKul{5b_uUK(T_y% Z3Y$s&ajJC$$ʪ0έAG{^Qͻ ^l[0Ztz߃nbk{8t`WgLi՛d?\Hcy@MXI-0&ӳU:ߗݸpg>Si%_sc`:N,n27xi@3)oY2! 
y""SMRi#:{n'Bw[Fj&$䕋hL)~v,lTĈNu)LˍnMH+ѣeJa4$^tPr+y3LJ-x7&LJS{!QL?ٌzuD{`y Ž/k 'btȸy"2c f`4Up "9*HJiNKdwEE")L Ʀe\|lwX)#$0t$&S w1g?I6Nه-Ȫ}͜[A2 7zZ)֕ 8e}d2ݕ'BaK*]Nkx>]֬&DSxvґ,Y=H`J^zJćm$*3`Kͧ3mH*N54xǿUiO9AKtte,PلĥX Q.UɁh-ZP]n> t+?3:,uQC /V~.Dn͝>4FqaφwfE5.υUg0ӨaOB V, Ԙ `,2Pg%sA|N7' GbϑV 夥9[{Њu%{|-^%"@4pf!i,IƝ6;Eٞ=Z~T~4S v9{ԹK>W%ɿ>>4c.B8 _X㊎1)iKG뒈X|K֭'xxSl:\$'.4 Ʊ׋CCך$1mxlEY+tÄ|sR&ai/v).j (V}&T!חތi۱#.aڒ”V4CK`M$! &yDd%C gԝrKx,Xud1 %rW7p49ThüHŴR M|[zIgnLai9dޞ7wXNYVɎ< FKq(ЎŷJ)c&%sb>W/ϫ!p1Bp aDC8Y} 3ε}Oo~AS ]n$q3m5l[#t7ka6GVzk-P<IIŴF](넆`>FᥨI. 8`A.YV͐lR%8bG=z}cR1 ŒrCP={"${@7Txn걋!3ߗD a vʹV osf/ED߯ @|munp[݄z[U!n.r\<״ \;mA= eU`+bԍ͏gv[rIhr#@3V%xٽ lE'ў 宮"F c5pXjD[^n#ޯF aV01т\h &jk]r0L ֡ar'p`K0o,)2L?/vD>/|$=OL; *6ukEmAz?JQ21I\0tQApZC>\U(?&`C &) )IXyYo\ }CMz!mP !H89>j.a.57dn7}p["ӛW+%ɶsvR; @)"ba%2qzpAvs Fo"bCl={w7G o??n=pӼ;|D~s%!5j񮉂|sLYwC߹Z%=_&en 3$.f⥚+Fq{zW{6~r2 u³ԙ+zh55n=Q,FBG/>Uc I'U0o9A~Ъ&qDZ0N'GGHM?_GxbT^w5z]^z~wP|G7Y՝ѓ=E_!d'?=9O)~FeXƅyfossq}ӯH ٗBX@W(e݁˦>Izhdz~tGн94*8ge<P=t-[.Z?\DdGY.W31{LbA|0uTz,[;cKwM‹pax!ihʕUJjDz71X#R^0jk!c hLb9 Z=#ɟ3Jg-3N+f ŨgOƝwٞv *#觹kV4sE`m끯5y4Gz0`[qKl7pC5ayWo_Sw ev\G?ݯ. ҧn*Hr]wɕ ZzWͿz\=xT6)\['*d5hJ\kZ:8, 7<[dc(T'ǁH> }…\#*s9Se-}Q\?ySbG-<ElRb9r^@sZ &heMg`:-i2;  ̠VwϠ_z K{~\#;RΕe jY#N$K`^GPޭ/L m"gG$Қvߠ4Sۙ qj?LCrZZ'繤h#JMv>&ƵuEFĜ&h Ül(H[#U 0gUy(aew*^e}-~t{nn_^4q<#b-lph܇aj`JwR֍F/=K$AK6W-\y`JC :=(m%8ȓūBi[c'.kB`*t=ycFiR^U4,{khg/O# g+àf/F9J+5jڊvWn0V}1xw)ׇWqzzM|!l,>Z9 jĒ&w.acg|4CcޫM:_Hz]`lV=YlI,a*  #,Kwx" !Ѓ?`Qʗ{f**)s!%j2E]BgF^zZiQmn(@zBHu# 2c"ܴaF@sx:L#R~}F{˼-pBiI5S" %e1BCդ] LfzCgab,z15ZM+ p'Ml o! o]L{r<%Y8-, 9H2=Vv_]veswesA)%1L;1ar)*$&mhteaKČG˸>!:F&- ym6g"ԦI'=131 0N~5*N["L-$Ke-9G" 9-ѓs,cH0[sȏ@ d$rxQрչ9 7h!nI+YhR 䕯5bV6Q&L̥"` ZȳʴQ 4xB#Mm*6KRjszۭ/V .li[u֝fiWilh"$=DL!E76|u[oׅw܎0aGqW` 0.V!latѵ5Boqh& r ]Td]Lw,!G3jQC-!([MFalx))\uJzxWո5H$vG[:[,ۧ1P˒N5Fg슪z[jF~ ҊHpLj4֎K:&}dwy4%S~ P`|dXU߹6ʿcݐFQi,{%S˫ۥ`]/ iloֈ CbnX`i P0|Zyg)Kfgz`;z 0gWۓ޵F^l#ܖp0}QuW_Aw6fWHie&3Z'[:kL`}avOfCɺf 1%`ȢfGg%)VkC)oa:7'!Wx{[9\IiU2uh;? 
DlgWPO]j9ف.KuiAv<,n{MY?*5֔USWOa>NjrӚsx sFzdf01l;Zwuov\16{K{K n0Sбx?_h-Õx7Z "ۣH3'61FIq`c<ʴS3V}n +^\/*6o|e%`Ҹn*(PcUBO*,*^ 4 wOO*@Gт?ssq}ӯH.]j ̯nΤH}i{:7wT ~}ճTdTbfTͳdO$+Y$cYa]**I5*sg4Es|w\1wGWͿ\=xTj;f%\VG89?'O 㪶'׉,8%b<}WY1!,2Ѱ,Z\)=IJѻIcHY }f`% ~[C1۫B":9l0(J8ל8(E0ZO9GV-rX  s֩Rzn{EߑfHĹ$tŞIʄ@`srxMy4ǫ&= |:_O|E+m.MZM.,y*tvC<8[֓C{1EC0eH{gdp3J҃ |LN3Vi87h?d\_|Z|j<`eQ\}Y7-]߼QfYAyFg)C(D^mj] zmey4/1ed.rMā%m/F*{3(D*+Uqd/>r8"rf!xۯz2eưgfZ$?6eOJϽ)=-RzZ- k\gϝn^xb~xL^ "O<6sm\uO߁z|oFŹ4lRڨU讚8A81TFt;43@h-̇6tCpb%=%L?3}/?;KAaUƑM®Soҟ1s6 ,d=Caun[>XpU>(D+kp~鷋Yc{6r$/e>E2@>~`M["˞L_QbracbXD7;tp Ҍ@~j.Ǟ9'6u&HLe߁m|6ʌ) ve"EՋ$uYv$B0:kO]~t aͅ L Bz1p%ZuvlBb;mϗ_ yv^NQݒѝyt*8JKAD"UXr"(3P>*JR NC5jv&nI: sx2YRhWiIXYiG=VF*)yt҇Bf<4^ٌ6bmKV!G)ur{ !Q_@|Aat %S#&FM *Ʌ*"3 `̖ oDi@,ddFޭ/pkv&ݪ~ր:B*3V4?vD Y#1͖?1fg˕cɲlm `%͉U$iJؚ7.u/Cw8ODYFlD拔t@ }Y߯KDQ&[݀ {F1Bny 'ЂI;1/k~ J10_DE:lw2Ш"T\#XXck0jTZY!Ŧ`şdur}B%7r3t~tRRk,YVQzn5oToV=N LP% gn̑ һ*A :FWJAx(yUrұJlJA7O`*2 D}*^ǜA-UK_&^8IiM9\ۤ7)ȕ4coo ѧ9cO;[l{EluQdhhE{^*9hf]ߪLR)@@Jlvb|/em/ `?8 .4tsKm/#?_o&VtX "z+4\r j׭͢v*f#2{} ܆U$RRpR)hرƳ:-i)m~IoģxDf默ěxC(h ~GgG\z/6̛\!9^eAEɸrZrAx~¨ʹRaFqy^ͤVɻJ[6}n|SOFZyk8q8h AD[[1knT5SVqVteQ¡XAVfesuF˕:;\m)ilt(m|;&4e|7R)lEpW_`l[b}^3~sL@UqR=NLnoﮦ9'[rӫ޽? 
cw}6gOOߒH@r:4WܥKD6֥gIy_gYq"*c=`${5Ɨhc - &"xrl7/{xMXpYϼ lV܎Cu;)xVgGěxÕUV|7eA6x:ZR^WD`!ZLc@3JjXMH7ŋϰN?Ƴkr.wv9C}mWɸ5:C8#H-| 4RăPPZ{{7 7LG,P $t$#RʶK6&l!N{;iAxs6 %dNm/ŔסAL2QY`}Okk }2Hco Ew[rܛkhY+O&v*{$/xW~ΰl]P<(;Bzx9]>*l<[Qn&›B<~P5V74w]$"E-EvGJ$04rAB 5B=|@eshPvm_,9RA]@,gelVhD"2b,Gar`ǂCMzĤ\"* alDI۾雡d[lEIij3KbgֵL~_q{O8}[}OkEm3h~dһ!k&aIjmư=NkqjGƤFve'Ps ]ykfj-ԭ*,7E3܀5ͶMw`ZmAx:MaoKn{!K_J![G.W7F_]άn$}ݟ-_sɊnr|8ȹp_wO\Owׁ<υK~kMtɺ6NnmטCz^ٯȞyNP$g-"g=GcZxf YޱƉk_mL3)񎞿:[524} :p &}5Q<Q[P3vY{'#h\vQԓFZ<wЦ6sL)md\*JI%K@(i)ġ2N8+ޯv6$VV@[S|]2{, 5"pk%j /+B#|L(N 8Ty'Ίh25"ZkZfw5#,mE-JmIk<^y&}4F_"cJo+ui(%z|E3_̮xB0 [ئd:Gh;@-j|p[$`z#o6.ugE~A(zxWbת__߄I@'ᒞga,FG}pT+kvهkΒ `h5kr u2P[޳+=4@hHI3jw&r3^˸,:ߦ/2h]䬴ߴ-3FNz1HuWoBRY"/~v5$>#a.F-8am+ܲ6KXğYbmOo&UXįV׵pՀtj[ 1t7 >oe5<0_Z*qIs<ҠA; YB dEY2ܔر+ ޵=]xF>L7榒^wb\oLj$TFՃ1ir΀{Xk!YL7c B!FkmOAX8ВW\Ҕ+9-IV`oF, he-RNX/Y JV,22j7YYq-+\E?Bv]|M>x@{ ݌pn"ЫImKH*GdQzJطMGMpQḵN"UDу hU0/TeنRGvjDmA+/P]ifPA:oZvÕ0|#h%H/dflVđA)ZR42r}zkCV5Z=ǜ[_jx2%gؖnRZ)@`Uq%N6NFhX+9> ؑsJmVL>nÇ<ڝL7PR)u< ϟVxX^O{s64a*&(< OhM+qV Ƣpk{[Os, -  jkw_g!bՅ6[Hu;YSE2Z +hHG64;G _TYq"NicŎ5Þ/0Iش%&CmlWZ劜W1@*tdm"Q;3Rh҇s؋b˕]\޴a}0exCI2UڷC<c4YdbYL"zƇ`w8r$Z02n8^Ɠ۱}D[pN!dºgPx19u݆?jR(3P!!"B͖W2\4/ y(@bjH]U Z;xiq8=2Й؋u|~h%Do\!7ypD2; Dk<[ޕ#=zdO7*JRV-UK=hod >o8=<>،gv8ƀSGDca`gOF("ERde'ZG 0 ^/Mpm@8DL|{̓}{~ O QWvzXw^kJD[;̘9>{^LMuggǷYH;OܟUEZ, ;;W%=Br?.W~ڀ;[E`(i}:PvO;!l%"N"eu 98 ݗ{MN>-1߲+<1E@)U76BU6I{t:ׯO>F#@pGM[ȏ"g8zم&1 _0r rzf7"z3g|D秃>W؆IιF(nS2ѪyS$dژkk`ϜApw>؂)S{^S9S5qp:Vmؼ\QQ. 
M5&xNMel=LFƅlyaŚDN4Z{3>2Hj=n[,W"ۭ+vŮs ժzT\0 2.vCn_DS=ʳ_&uO;許J$}gcf13(Lױ=F Ӷӎ^D >4 m@~/wxln ɺVz.(%/oo6zM.nvA6G8ޝ<UZlуh6{>iӳՍSPW'Ct7Dy^!./׾ |kMdDFcov$yEgy7Q47G4FG' V3 } Ym3 c;᎛+u9_سP2^N$u|˭]6p*)}YX5WG[>Zch_]bc+j N+(HsU_w {4M3O^] cz>O5^4!I報@إ|}#+q~0 t% -Vy=\8Ӭ5-m߷ѡpj(jֹ*N#p= hUYKA7˹}w#U𪤂$/͘ X\UW} =\)IԑsHw!GX ܠ\{s&y^cڈ\?r>x9) a!DŞ9bxe_w Q+lP,.jH]i6i#2LBhM˞9c1U k_a m +G5>(RNyf(4QǥSg|Dpb=߆0ٔraTEpsUU5$.DgW~|)JhoRYv,c1`U_L.8N{⇗GAt6WP p1+7!6#@.B %DVj.Q+tUqkݭI0=tC癖*36*(>٢͈{li{u㐜@OdRɤ\ ޞAv7o[=uɩ&xxƧ{U|(='=-(԰ 5m,ڸ=tmv6g94um688|9qU/qEDai";|Y~>L'j$7+z"$=zY2^"Vm2m(9 ä}VJ18]Poyl U$WIf3Zkygg3h:x44ZB'HSiTCU#x%0x 2]sLv8hﴟ}]+Cw.z%'貯K'W \O8rin-mپ9_6E|t g,`i@!5Ҟ׾?gz}LL;Oƭ2qxIĞ9cezw|* R+5딘>i`dz r[ wB㜐K.T_v6ᓼVՀ{ӷ<$CK˖Y[O. z~ R {Sa'!ׇRuXvl$&A3g|hZC4o,띳ql}juk-|y: W'iز~Vak.RWegCZ$T|Y:Nc}9q?=B 76́`_[k9j)(dzO]$0OrB<'$nnny)íOwn@3HL) :;r^7  ;cz}zi|'\=`/wsN]j s9c{잘RӜ)O8LjBWΕč 4@bV'V- ["-Lmӱ?yH~B/8 9ߞ\B}#Vg :j|۱O%_ ߋ˓>~ ՟gKtB.ɥKT,=XմR3B)kp!*ǽ&nMhCO>E KmJַ24FV:)\jV.;EZl39hu8Ň_SqZUFvt7 ^ ;3$HJa~@g9F`( hoQ5WB|)ϻ ;znJ~>A7wpYf{\oqcu3_AgYbR-}a KzVD|k&Dx !jM5Y 5r+yح{ ";;Zd Eb`r.\A` x\"6ؑ_o,#RL:ԍd'c";#6)TlqE2$UlE<<2yIB yNds# Ud'H$}٠ɻ"y)<RKAcѣ[i^(1Y"KH>y e!ЃHDB$=|^ =!#R".̃ 0CtduM|l6EIR%92EA6 tT(mI>&9ަRAERGB1<~4K,:R{oˬ iHVcdd,T"R#LrXC$"Cԩ6 e5+yop=|eeeu^6\_K,?X7}KEèv}+DtrK)!?[uD`M$ﺮhm miIr(D02<2MnD42@V6V*[yTS'.ٍ;$В1Xg9y]r `Λ,P1Kǧ:,Zr>lusn.e޺nݼg<@qrZ*H[5-89'.j[ ,=+<|fQ2/]E˼(v]ە;(|"XDUT{ N#zN\,XX2砄N( }.x[T$wJMNvZhkq)񤚚KL{gDk.u/BuNG0ylF&Jr `d\c&QGᣅSJu2G ;X:M|ѐA![j∗q雡cH> g{M(gt~EѓlFH !Psf 77#DWZ8D0j):- `]Er {\xv}tr-Ewtqs)too-z2/5^,u<*/n>;͇{b_paIH+zv&TRއbjq;mk%tL<%HЀl }Y TlUTÆ|WI*vj^h0]={N&è7U ԓ\MFC`xP4Tr.p0BP}]#2 $As LFAHip1sgKs`rKd3CO Ahd0 NePp*&(Q, OfYxdHS6YvLgxxLRθ #gBow9Ф_vZ(/ӿELJz>Ew/ڝk[/[{:|r5u^Ͽzh:~=rIsFBd8u6h׮w߸'dK^%olsy83i[kL ӿJg|AsCx*΀'>Oqui$r&Pz]z??2F9k/p @&]嗜$op~$ |v|d/ö< suv ?|5prd(]X23 gݎsnѝI !mF~p厒 {j/1+ AÇoY{:ڥ.;nǗ] 0uy `;uAxTv}ayg~}[xѣ~h!x}qU/7CǗ߁#o^w#Y:O Ñ(=t'h 錸#Y{O2r9qP|X,J=8/ǐ= Ǿ d=Hׯ=C./}y=kT>bq:=􇗽 4hCa .7t|!c^l_F"z9 .>x8`ZA¹贔*s8DpsB,b(fOSzb"`R7)&"x c +k{YZ錫Ӿ^" T o0;^ku[A`kmч 
TzsI9Br6=R)%S&=&& \& s$Z - ֪ TV;\Uf[~`8KY*RpV,.ӛmۙlU+\IwBHw[*TwTԃmI"-XG&)Qj\Rf%U;%]BG'fπW1&=1.v:͛䑥'ofR+a wF D kژL%.teH+gG}*(bW YfXtV"XF(i, ]|K+doL, c=r`'Yd>@-KWU;sM[,R[,ʊmʀvFJˈ|/WUw_)ӯ5 G1OiEێ( J;[},IK F_%]~iǔvLYjǔvLYS~_Tjƫ,_;LLon)$4I7C4hxAr  +P*2T,4.Xy4{:$pX3hR,(ԛZ*ץ\o I焀e@xH8z칲m0, l~)eLx^2q4 !K4!!+NH(e-9S`)JBW2f}S3Ë4(KË4(+/*e,]8[hX-~^tePYJ&`r,fH^j`LD)mZ@ w!ETX[=ri_:QR3W"yc߁UmN$ LJ2Z` 22)"Ln^ ?baqdU}(F7*DN0MeI~`5L`i3,(% =3<;h?z/m{e7~"J.^J,<ΰQ۔a+ŷo F‰d]"6?ˀVVHBpR7]_ BwAJέ، YRCSRm_$c^iͲ0tKc/V#ۻ{طYWbY!O%JYWbm[3䂭O%V7XҭĂ /A#̆'vV8JEI>X\3kZ(|R<~ ֪+ƄKzW+ E{ӋR70pΖs!-7O3o'?|nո\ zƅ۔3o߸u'i6Ait/A@Ml8Iƍ4m:5"Cb~$Wnl_,dIpb&+czn~Mփ fo(t8IN{2"μPC*9=pu4j" 6ƔؼA̒Ty}I6nۼnú+d o\u;#O9yʀHmU<*AJ("rLt˃uk?($LS㽥sG4>;c+k))2uG=r@Oq5Ժ+9k#%`i-~Է@ tE jD ){uwLZ3hoHDuI.}BU' %&kOT>Q qIywfC[`ݷؚA f]-:/M)%ה0B`]+unb= K6jo^<+H&3B[f9 $g TTi@;ue\>5Sc.8Oah*$dki i)lt`R>x-,\ xwK׆O Tiҽ%r߭.M,(Y ԁ NѢcX0y)`A!F=. % #^`4R|M `URV],Ut9]Jϳ=GLy@҇$e\$W §aEzJ75NIF cJ2 Hw`+fTԊh`A20xIc+[xemXv^ez`Y=-sȲ2 2gF=ZjY% Ra`fYmQorJuD] (VhTISEMH7ӨI TW ձ:RBXBfUyVhxF,,D>X3h%o}fgNhxJ#f| "`*q@mkEŰ%igtJr ְ(/CZvSAMv_YmvPmvPm=4-U3}=A Еq e tD#2HD^l%AǷ}*ű6?η:#hߺ#36v݋(9]7ȗq`f0(VpcR" 1J=XkS*m @ mԜcbn=VڃQ{0ʃ!?&`ڃQ{0΃qG;kxf}ql"X:#J0u͔f4 -Q΋*z#΋h{ER+2ƀB e jҷLF&R(}Q0"A:PzA+_lY+Uÿ31Hv5J`0ɨLv64$%82a= " :*O"Xed!Z|Tӳ 8Tu{R`1H)2oY]HD@OfV"NA2wRL2#]koǑ+faQUe@X$AF> -IJrߪKQR\˂=SէOGu,".:K=f錖::G>nNwQNTys*@ͻt#)r@ (wtsrTtg; ~N Xs4SM+` ЅmP#0S3qӝSKLVB xkG"ou[W*5#pK0Ty*1 بvHB IS@SS͚D%;ƁРE98ކօν;3& hnh1l%^n \7ڏ3f ;˸4=$$t R9X5xH[=XLZszYD0dGD*s4* O Ӝq!Zk!53"Qe0ƅi( QÓHt2а 36__AW_2#:wmA:Ab?~vzPoNO{vJ~3xf$+'/ί_<֞Dd1xoÙtXI~NO.w\\H[jx{{4FڋފD/}F]lkb{&'rޞf|Iy~ SףN?ye R)%Dp`R3Lwa0(t5K+Ш5dSx{2[VQn,|^o% qa½j6EdQlu9ɷ C42F'ME L۔|\jK!KƦ*AXk[. 
G2w/7K"(Tg`)CCyY>lzEc{:CF">pWACk4L=qHF>vtM{A!K ":z=ӪicH^nk8iѺŘtg*/F3jB10C:˓XAAW9d?+iT !#t qLbdkf+;6/Q'ҟX&<_22K8h @,p&HrÈ=CDa&,%d{va)}U#bp?wQgHFqi存e;Jq6샸T˂H9.% yogGpFkY-D MKi3 6*Djq)*w\.S䄞ĥXD8Y+t%l̀&7cdoD5̀D3pt;s#sq #\qp0dEj3K a|$dz //13юdTϐUacԦg(,#Z"#E[(A7jl%X1uե.JQ(B5یGea&S1p8yi6&;( N~Q7PV5Aɨ $)`7ӂ (?Fu3)C")g,΁g{FD(yF/c0 7*kJI|2Fۜ5s# vg&z_$k)Z\".\=(zW% 9`t槸R?sN.P}ve`'hk2sdl8a#ՊVF2>^nZ#qݳ,Sx(3'bX  oY*Qm0D{Jaf'ԼbhLl檥 27S=&%MEy7ε, }'pt씓@$0]@3-\$E@!QJէ©"OG0 ͵gE7o>򙨳&6>0of7G.<ꖻH|I{rGA -w^Ӟ8|ȹhB0(i90ib#$O$?Sm34k>u.#iaH:*qAwǀd7 &9}i C1#G;ڦ(Êu8.qha1)vYnM2B@'iǖQPK<,@܋ fRFds˅ѥKq ;6Ac$ݩ~7kpivA/>D0NDBYdȣ\)㑃 XqA0{fλ$ @BdH) ̰[nt$֗{ap9Z1n=RӐ|gYaZuc4]ccqmvA{4RhiNd ̂VR~JIAO.{\4 U{Ac*5|\؎:ՋM"isFR|yc-0YΉq~l ,>$y Z}v4#g-4=]-.D'A'Rѻ:ӣq•!U(?^^[0IZoUgnToOO޴%m/Z~[ ZeMHZȡ% n7==Wc\{p]ĔBWC_<]>_;^xs[ξn7߼>[Z{6e wouHSy=Pl&)rq'A>{rҋ@oҫPt5]+_fMJuq ;Nv ~[kGfq1ND{,oW|v^ ~?OnpLD_m;K$ YoE/<aNvdCoI`\{N}eBˡ2shhْ߯%q2Ù3Ffk3d0E HN>L*X$_,*؟ap}d3D:K@_JävOzxG~jcQg&SЊSpz.{G4VÃ<b|tQ(czƑ_i*`à Y. :(uYju*3L*E*J3q>*f=o_Z8DP:*߼%hLP=Ϝwa{Aܨ1T{>/hw V~b0w[2wOs~3a#\"c QY fe]岬sBk(m?'4㝼IcpG꣎3J5.ݫpgUPiwNOr9,;F;|=o29Z;C<|\35DfypL;z ޶J?G q1zc[r_4m^L"D)lٖ/gpg[DY n[ Aٖ`3޻ևs+!" A0REV}^Oz|zpW/tu_jՁEQ=!x^H|rNXEÚy+#h%sB`Er*n<?l=~ѡJͽ?绺ɮ ͗Ǐ?T73sL EYeiy__߶e4޾Fr7w(? 
R HTYg999u>UF <>!ym׿tiV{1[=OĔYhw%o4m}i/ɀ}{ ]6+r5P5nF'k޷fU|N [ݭ# Y"1$sF h`(pb@1`k1U篯kv>zY^ +XCYZꃟn(2[263beZRgk$PC&T[t/u׋(>Qqɿ1ЙSt♄JUUYf#ۢ/BGw:?ix $+%K(Jauf̕ϧˆb fqKh -G(NgGED}-kQzDj3@KXݸKR,xw;x/l9:gc:ԑe.P('RݳwHF:{FOBajkgA\O" )k&ɩ# nJwt$I7+z?lS9| /Vgzʯ_uz)p;ZL})ib(ܯ:5ݫ"s.&(1aYqI6ϪA2fvGE(Ո 8_*1ykYgd.j3Ppuvxo#8C:xzOB ǪkΊ fןA\sCBTBx煻x)FHpk:yhMߍ7@8h@6h?'7њ l5 Ȇ2ѾMui*5quUT1%UU:V/~\o'^}P},F9M(jsnj[̱d1lJ[䥐dyJK:j&EQEʔu9HF%ؘ ;/:qit\<} G|ɜU ,gFg UhV y6ڞQHBN8&v蜨aEAt1oBHh>;I9RhnzB09݄F>4wfY[  -> h=K 㚦-1w9Ɯ,{cj7~RKp/}p/,y(9Z%KwF[rmYE Ee@RaQMeYar<)kB&e ]|k`6 6?a[VkخǮ2}G\g{/%]]緟|//gRAgh:#=6ؒOWN9{;hĐSV;ڃFst݉=H\8>YYjLmCYmCTE \8e@0b͞ f9ԶAmVHTYm MR(k߁_Y܏~$:f4S1p)'G$ʮnCDTw*r OYRfbclě̷pJȘzaE>C6Yuꩨ^?SRH2fkN@>蹒aF$̯{'d<ӯ<=Cxc$'J[,ó bܣ^{KzQ/[Z̒Q 0Ǣ*u4XV)#lєiVd^c^e-Ѕb ]m,;nSZ|=h@%wd)sKýce.\3q,K`.}͒/Qv܉RKm/z/@(_ilh!SDڑT/aW u캘Eχ!o!/=-vfr$c7Zhtl ?~,q39E/|۔_߿Ba:IQFhÛ$HHu.f{+H-qNUmKLZ;1qr+rl=><~cxKP][m0hJ.P;Z ǂ3䵿q~jccY:Qu#.^g-N3"ǩBL*N)z$ߢQJRFj$$w*FB4}w:lI'yh>=s+vMv&z!͜B;= i>e2U% gjȫ5 yOy~zeI14Ҙ~TG K@D%pfhLd?uq&& ?E愈:֘%f-G%?9`xǖ*iES eC0-NU*yG}`sZ7Н9a"~+~4tnOEBM`m'US~.kD]fJu|וX'MM%c4g$yF ;3#*߹QO_`uO,na=RO_*lyyVU~E!&SbT돖Jʦ߳;%)Tҽ[7`􉙋݆B)=H@0~"h\"Cа:̉)\2zYdzVek;S )f&5CskւaYp{v^/}7zJ>5++ JsSi*eY) Jb}ʭ^}ʯaۢ#HSK-/hit oGVАy3oY_4Qm¦ޝ_Ctbd}vnKXJywS8U\eeL/%ӈ×̟@9HC06+Yi2 2]Rl/]EԈ<5S: Xϋ m,HAL=}V\HnNW35EtlxY]0/B&IV 4V7޽"EX,>~}3{ApޛqqZ"#2}&{{I$qGt/iQQϟ$~AVHb.2cJ@kgh1Bx]UJgz{ݛ(6칐es"ݘ'+΄4%$`2^S/"НLSg=IN{C}-\tU MxD2L_1[s^ &{)z 7 uC RL3`ҳ6vf xrf="L!1}6TOj >jFmnQ\r7{k!>~$ X%ժQdy߅9J%{<JO,=}a"HqǶWT7 TӸ*~ojFNwyAI / R/CmbFoAwV \0&Kz(> garv%F۩ NqWWK6#%iPlxg&hx6]5Q^Q /y'&LZƖUKe3Թ@Es2TPu! 
XE!P!K^?C2oa!!)e>'}c E Xp_So=G.w Qjso[EP8nwq6|\'쯷-WT%l{~c#I$Br'(!2n @/J1ND~2k!J  HAf.9Y)[5NǷ^Fˎ>~'n;UUax {΃PDָz-%y߽l8|lNSWp*^=aDKwUań]ْOGŐF u.1F"jsNY?cQ^M^ǕR)a # Y;Zd-1WaU Yb*N&]6Ɔt&4݂r0Z ( SxM 5/{Fܤ.Xb|ۨKJ3 @$8q#2,Z護ٟ/RC64P<@d0W!iG^c/YI˔ iSFk'ݐoo ?Lyi}>t.|_gV< @Bk3`9`]km^:_KAݧ92De_)G!l۝s5{`  *,P4b3ypYdˉpyKi)%4kuްfNQd* %@2h2+ew MqSMޠnm$i q-,xxLb-Qxxy}\ !^iu]Bzb"snՒ `-S)y$(EFJ Y^{x4?Y;e7w½;-Md&޵߫@MœV+fSB})GTBInER{^%ܵ5V\/aVArKfm?:sS?:ѶL 'ȭ<1~lrmir|FΦzT`ުb!8wkȆfJ֝/vKSv OnNNʧdfӘxέz$R!3V&sVr'(=.‹52y˔%B;!QI\""t繴1H{zj=oC/feH4g; ?Lk]Ii1" ܛn*7 u d?S|λ*/2]X[sh8:n&=j'4\064`tI̫ i&QPi'K <]^u}kEde7W*r!?:ɚˤJ~Qヮs"e77uoNiZj \=\B(3_y짓W/_8<3i]W W\ 蠓voLr|u×}vx/o} WL FW/ܻڠ8mɳ4 (.Qxp)c#Cud`!ĞtZW>ien7r D`l< wE~:0чg'P$E]l2H"|rL̐襑nA_ge&Czsnv[щ*̃^PS-vS|[:C\OݼO7V!cdvyh :N3WUܽrvS .r0y֋^eG0Vaay-:,'ںF~}$ixyFȭ@GcI}EDl%JXAKy$NÔbg$ @q!;hpA<}'x–.n,GP8HaBX ҌFa2R%QugUr\]dRI~c5GZ1G:<>^!BaL&J?|eIWW=whY fD(7#dY*+Zc JtK$61G>Jr6 G!3ҭE@q :F9*\F<ѵC`j_{q>gIs<saRPotО95MeB<>isx >!~ [YC +2߁)aD3DQ,;)rbiSN@~K>\`L65܃猦U3E{ ~*)Ep\o˷<Х0`ihSȦc偈)F_XUuD1ct|MC$, 8pF! 
WMp}ׄ&\5Z=2H*WGkYpXcTaO|DeLXqDHQ#\ ._?2W}>^Apihm ژ>xk C:"5KR%]e)mzS ~9pnG%0 uӪ&;&L>d=D'0?偿X`(cySʚ<"rk(05@MVN`j.gXG\HDrE J`IARa5m bɇb٣35s{?:Uo=Ze>٭zTڞZK)K6Μ\}J:6gj3TNg:q{y)dtgQKjsYHa0Ұlϵ4JsIQDmU8V{6-]keK;ź쇍n *iOxfʉ㥕v6HK%q[$2R> _MK؂';ϟT;b0,$,a&l 3g5&Rxf69C5!ĨkbkIΧsvI zJ IQϱNkd7QaEɜsm">;:>_ĻRBC_&7NO~qɟ+WI˳5)ji%[Fk#6l_p_}ɞWEҡzeڤae=v=ƕ髭vjɥONw}1[.57g8s^4~>5d|.&YX0|e5wV9oS w+B^#"< fGhlslu F>7M?Ytɤ}yU0J3FJ x+-VaNJSa`XtuMTcIA/dK> bOkb_ľ&5}Mkb b/\ľ&5IgPzkc d.gqD aP<Ƃ1GPdP+6>-g@LȭY2Se}Э%3U^*-4Bc%3SL+xN[r ,PP+4n4R XH 5/ 7nbj@u I_3`3c5ukSTN9;[|sNb#*(qV[011eҖ3vP`'1LyPd&H9oI[)Od4Gz5|Ad=o;J^H+"=I0V^S mLј_0y᭔KP1B (Fʘ{XĈdV )z$Чb@ç;9/AeV#uCE%yUh5i6MWJ4ީ]:T (tuڮmr:ci"R *GaS4ݠl)0eR..َ̬^N4Pz.;y{BG+v wEJk>;ql({Fgy}(NӤ,b+T8u@xJgc  C cBP˔=Q`+eZ-qƷ|0erFFz&=pq^rPAM ڪXxlA;Ut 2fcÀgQ/piq'X)V!4BGhK9M%1~+H%WGZK&Z2]$xm-3݇VAƒsUzZE] VQ/yII+#n!*J㲤V"ʢJvNӡE4Fm0M*u*y Ø P iJl`ll4N$&.1076.Q.Q.Q.QT^_JkB 6MM͍(4Qh:tF妳R ]YJsR̤7p1FGPxLD5!3 g9="*OQHuTXf'$,V1՜h9#vB\ڄ8APt1'JKsX<ZJszZaۣۣۣۣ2oe-c{*lS"2f(2@K06" PF*hqgt^- 4Ov޵q$З~ wuOY[R(ʛU4"dg"mFd_Uף>bhb+t(}z%vz/gSm|BQ[l0~6gл}{Wƹ } 2ʠ(n烼Fg?L>O7.a"zx;GG?l)/Nqi/>"$!ʘhog\Mn?l Jc2 cʹ0z1^b++Wx]A i @zznӠ";G!SB tl 5k< ԷxP{ 90+2:OڗQm2{}+t .M⨚O׵NX` r Š hgF26G;I$ c".i@ 7c٤c O(Yhj-ƠsYx#ܠPj8fCP j|fp5^+:T0۩yoh4Mș1 2pGgCg~/Ndo8K^jܔ&=CǂY鉈E9DP$^WKq`Ӛ>lh>0@z6K*%tI a$I@G/V:0Z8-R; ?=Rg;HDȓЈJDE5@ŤN΢VU#2;P:hf5,FPQUĀ@ M"9CY,]QB LK`ɒv'}4$')QH "!Cxɫdf)Wٕްwn ^{8 p iHřbR mp~PUYA>cŠ xEQ! 
W؊1*@qx<3VaԌ1wIjרF -y?RX8-H[fipZ)athq!bDRǤC-lѡUS}Y9F:C.\q NR8YN[J0@Y,-# K<"f&`dt4trِNBo<`cNaSlyJQ w`lx' 8ZbQ3B7߼Od(AGH`$[eQ 5xE5"t6]Қi8:LmpKe&C\҃$:ii_wKQni6o Ru `C2$ A4EަA99eUDȉL37ɔB%c4BzkTr"^0B70*G׆ĈF%A P J*ǭBɉ4z-I DjAtH`Y>Q!t10K%AB'0ڜڀkhe2Ta~RDX p6(IT^\/%|W|EwA48$Fh 1eW]A"=l8Z" BZRJ$,9),LEgP99T0 !pZ?ój(ޘJڜR_jlpt@?3RGqR<9@9A#hHP!v;KQ4=eM*3~(I(i#lcGa t I{*ii镄kDIsE[%Tt5cB9 פ=4(5ƴHޒ&F8NJM&VN$rZ[ KG>᝽uj<ir£F"qoHa`l4>5Z!{tʭoA/g= edÉ<ߟfjj5ÆS &1o 4)uI$0fTa6q<TIވl|\VAIZ~"AaBׂ|B&#ʜRGs,a[֌J4et-޲6 ?YaM&Spj6VyRno޴^{ h4;k8Y뿔.rFI:g,3se_twfшJ#/..D2{4h#ITW'?).4 ,[i@:իU&0t} pԎk4o ϳEmkLQ&]|?\Ϊ\"Err/I%r\ҲAȒiU 6jKƟ_?gʀ%uU]^e{bPQ܈TQJ.C1U߳fdF c25\tZn"Wj+e OQ1Ea_; {g>3>3pTn]ܴ/&Eڬijo۾-yw'Yn h:WλۑglUےVwCf9ϫ?~g?4wl>;GEC_Z?x*%gg7VyV=79?[ZFZ?ɭ+uw[1 Jq)B(1!$S}#ڮ$kbn>i0NT[w ^jGûF{ҍxr5ͧ.2Q|O=fTӼUZh Q[y^(G+4XiHDhZfbJib['dk|xZ̐ϘW R"le(!՗My-bɶ aJu,_!A;>[\z,d~]f1_ >&s}]Ի?OjWiL|F= gg;M}~,jPN ɭ{҅^>8 N\uϐi96ǴŚ:clEOLT]I&Gr5Q2F}m=k ohh%kkoUoA@vZk=pC7j6"nD XZ=ex׫ՙ'x~r zߣ;|$;6>-+Wnbsi930IWJȶ Ss )g}ݩl\O|A4m}ί1i)O)˜3/se=s<6h%8GP@N`8 *yH8|MfRD/V]'_߼*8s_S/j0_xѹݾ+JG-.k[KJo"U~lWjgDw[NaYdDkmq zrRs۵U5  `zQR3Ccuw_?u"΀Itk]a_i9N'6j.sQ% )# $L.Zù #%o{s ]C^YIW.?x"HB_%/+!tB@ Z>hD҉JfjL NKƷsP60,_&=> ;ua%?@eZ'ՠ"].[*ܠǭC'su;) AY[O+y%>L'[kSF3bT~hQWCVH]@4Y9g |^wu;VD MUjy"ף8l%m <BayPV!>~^7U?N`:<@n|kvRB=ũ//Z"Epp*E݉6c?^eb~U-,epAvIZp_=~ *vjEU 8JX<.WE o!<{/6:W9Cq~a_[r"Ή9P}>Ʃ>`߯& CU vp4f p٩cL62r9eZWj(5?U t߄(.xxzYղUådUGWZ;~_txTԍ!(ϖΑJӽ86 +[𢴼Ve[IUyqLҌWlyڲ^ׇKTl{ :(=z1[O{ehs9Y,PDYŋvĽXN"_~)*&b9/o8QB>8]HS`ϩ年÷xX#-  )TGAd%'Uiޫ/N[wORʹN{ @`++9d0sHG!1\0gQԝ!po=IPBOS b3hC""-s%^]QVꉠASлNh%Ӵ hyO̒'9e-" b`<%$̸mٻFr#WzP3nLw/PptkFIuێNYE ŪgTD%2hj r=W;¢Kxnᅣ5C1jNT++Fo0I$P.8W*=tpz'%B2p=6J36R˞m5ߘ0eπdBvaj^tjWoW=='XH_ܢ|&ӎlxMϽyŻv~aR'}߱iD 4qVRGd1W}ԥd ur$@RCT2I-NS\ /{ Г) S2|c8G'/bAH8^G!$eC]- glf!4IzIX^^ 7) GA}g>PPND|+4j_@05 MH%-3z;x%J 2`G{f6YzͺuNkŁ#kJH!P}3_7aSAbO0J&nd,ꉤU`rdm]Lj~ljϯ?ݢ?,-̯Y^u^2Pڝ{$~. 
)۹Y{7%Y*EzoRkOepC֗[lPIo>doo}8H{'`5#/?]o_S]?r@ḿ->^¬~W~i/SLNmS[xoSk~xثgg̩wog9uzG ցٽa1DDoj6v㮯 t{ө4H{qv}wGiM<7׾7yW^ _' B:9n7v}ݿ؝&o L^?hO~1otINmEdlt=t;0TYOQ^'\ݢ8Y ${T}@I/LQQ0qVP7hgo;SXG{|Rvm+վF31yC?:nYɿp&zOE~M>}ևuc/k\A}ŪnȯֳXĊ׾s޴0#'粞O 'w3bc^IH 5Lq*c?/FF+j ٺY*[=~GqtR1#}& n« VX݄T7!MH/Rfa_!zGOGkش ^XM@iDH7\!nbkE7Oj>u4DJ7MVX#.#bpH9KBo)b9CiĿRt":2$~>{i^nֆuOwzW{0& o !~ݝ@6]mVGkR5/;'>w?:<mC2KL.yo 6}.W0nwI$wF)+.<@]wҽov!߅mُqK;yG{4] 'ސo6$6 Nw=nJ,6M0/dH%=HKؾA֣G-uՓ8#>U,Ū:i;}PDD$+`!9 2D= Լ8b{sÏ@z]_ ui !.u%_,?d4 oԐ&LkWvg_ Ř{Ws?{h4J8ZGd3LJZ*Tai%X| !P2O3^.k/;#h [MH74ZUZg[UAa ITZ*ʧJNh JMyÕ4T.ʞ4Vd)S9ʛc 12焔J\ޥᲤ ?OGЗ2:k7FLF6j[m;u3S5̜$޺Ѻ暣R5Ж@8٭cI> XJڂ+*Q/t 8p:iS.ìiSx_R"Pv`ҕl &FiӼF hm8 vW@ g8oqS `:ao@Ataoh ?Hw:)`5*,sC"1*]<8~e_n[R6Iia֙Ep|n eTg] ngeBgy.ݝ? Dfw=[:澞ng2Y$@F_O}+& L.q}n^0;+U=ϊ.8L^E}EzE}\{gkX5*{; poMq}F?k-O!5QcN3"m7w5^=נ|bYg3jA\E1'a܍n󓻻=/p&%V^/ICRbŋ_}?4?Ž$] i]Ez>fek(l^_~L{wNVT73~xM#=#PfwzR^*u|9Q:YY)R%틶-$XwB:ǣ^)[MWLrjWY);a<%-*-KT(bXM&!%t:jݹ?>uP$P[:"gx|1׼ִZ8?d%\wқtǘcxg׮|Z > 8ukEt?^#^nL'/,JJywoPtzݟyЍ|ݘjE0gB@tmڲコq Io^챶Rzɢ;f$òicxΪoä@uo`{)QsVS7;9D@o2F큘i\~k;QWדNI[q]'jWp*KuZ o/*aNxdJBGzםIDiLGUϓ1dGйIB 3]HE o<A>*+XվU@rOʽc l^b8 r@X3LZ>v` >3@l렰H_5;?WViiꇁawvaO@5X}{?> ԏv(-W~}Z탗jy!"X2UMڇC3x5և}g:Tm/Y-LqL[¾Fq0\O[(':'kx^i.MNpHvUͬBݻ zAe{_rx~L{Lwk*u6kMQ6Cʘ_o&!*?>: SLR" ˔/PAFE:`>(g؉i/pJH >pNQQ#qQC Ê:DԂa¸ܵǰJKM; hT#%YRB$rgzͣJaaSZkj\*R݃͊1bW ?83ӫc4ƹP͌؋5XcZN'(Biǝg=i@VU2brc+Y};*v^վZ~і:HKϵ̭cknkNfӜs=zpYq`Oy=0k]~J4\t;wެ y<=v_o sdHYnp;kWpn?NI{:6Ч*Ftv6\GG5kHa){v)ݦ3[\z:!3BJf6ˬdF ,1(;81D&0m+@$aZOxLaq#NVQ_ A=he &bn/|vSYXɝka]G!'e{hYz wnb(PpS:͓ysLKXˡ߰JJZOA4bai:m -ku<ôxPSC j, Ƚp4{MST3ou>B tˊ VMRBBs09(#Zbڞ˰ L|cVs$E!JTx;jZuzk5{$Q\^&BUDS=EV]a %m8ꢈCiۂLH vJ{Qh B`SIDO. QBE,!qJ壉z q?STZLB#_}V0_?_iFvwjӎ^~Y!GXO}93Ųe=qw^_?~&y?/H#.-FeQdD<,R$[z /./c> RȖ(~۳ŏŏG^GBE]NPF;:eJI?Sc.)~||b(~N/SErI\$'qQvפ94?s\DP=IP,#q@ . 
h=*ӄl!dU4[ Tϳ">DcC 9^,sc< (ri$q͈J΍HC&󩣘f 1a*5HTݮwMn( Xpcުw͆*JYYDNpXTAV[%Z2}6);0JKl1aZ,6^) YX,5}R.+^V*6jnhZ $f>iHaA9a3M)rqܸ o@A~`Ԅ݆7MѺEyw̦u2`@ajkM*)d9qe_pa'd$w!0p~1K{&ng?pfBJtva*Z^gˉ;ZS Smc ֛;34&b17bJ!Z 妰XBG mj=؍;mcaPB*96i𖞻h9x!F@ x4qg_K4ap:C#0t#rn0 9 H&c*W8xw}m…}WCP_ԿK=_ BA9w/=sq$)E׹ _"^-+ Bwj!(f`Ăi" E")(wcΌx>XhĤM\JCRFF+$Q>D*= N/.=tYۉe=u^ezp_>Wu nQOu=8O;K ].Ϭwx~?:"psw;9܃{sOf\r]'&{t JIJԻ>8Yt^+74UO j}k\~#-ԧWMKhqV'58iʘD%296gТ ɊxP )Ea$ul01|k9,% BFH\.`&aـZ&i6 )ݜc*/`KKva;‘`n'֒7`7|[Q^>>~m R$x.lrLc8oX t{9&:kwBKSn:^vϛM<]151Q*+ZQA S%42Fh0(A}?^?/#xoDooveOIko}ףl:z`sG,CewwZ5wSF[bڃrP*iAjy!"X2U(=&Kpk^A`˅6(d*""i$޵F#_0`N$b0ֆer**IUTj`F=&`'H؊y( ka?zXN'Q͹1sn̜3bṇi31SΏy\0Cfwȑ9,Y2/y2Mos\-l9[``̎*3!cRcΙ9g |sf朙9g&u\|jǬfp2Ptn㽐^u6 P*%!  TAt 1@rXe$_VZ*YI!rԠ@)\I$}Qgf*i˯wvTw`|,ZoF ɘ{)^^.cbwH eXyo`VR6Aŗ6޺P3,]-c'_kX{[0']πOY:":\ :(EI!nm[5sq Ur=/ItFAXHwf]b1)ۜ,3 5%[}VU-U%H4f_B^NbChQKwJwޒuEG'4”!fivJ*В/랑 ,^$钵ŚjqFhalxuCXlu{b^ f6@۔LO@Vi- &st(4“7TDQ[ND)ICvBMq 1vk^c/=LR}9}ꢘyJȍJ;TxC۝shzՊpjJ$Q_GhuZnA^GѴ!EμȚ́aL x\q6z:=S !jAǁ\d !i.DN`j@V\cqHnOJ9p`/^s{ֆ+N\lh3`khƚlV|xM X xn+V2mevL$<7{Wdml! 
I,1< &q@dz&%jCJ ֧۝EhGo-]dnnyNKyjY Wp*jE @IaEIDա>e~ 93yo҈˙;qg"DܺQ/L܍g*LŝS>/s' };q;8wXow?rOrwP3# #wwMBgNGn\&ΤܑI;LʝI3)&mX`\6X8Yș5q*eȕWYLTh 0raz MxtQ1<?r*W2̠)({":0Oxw1c0(:mv\pL`ZY5C/#$dw§JF!T0ZC@ pDqΒ(҆Z+;B2Lݠ>>2gyrAqvBTش7N;=Hnmo04o<渷 <֪`Juz~AxgeIH$ Ҷǵ7Eeq@A+ o_ғX;vA_{(4Vc` (ܓ$_EmK6]i)s4O2I*dI,BBÜ@e +5`T.yG%;N'N􅰇(L=r XWvUL>4uILSRD>cNX $%U*GnR]B 5YkE6'Z@DilΩrȢ`B$>Sn!_hʤsǏB%Ϸww vrU?z!9}=Ufd;z Fb^P"f)I]1XtVH4*Ƚ5鸫odc՚e=Fm(>Q:r{1<aprQe tim\Yvӓ?o| 2wy8x{2n{et>tB@|i֊s ̀^r^SxaՖvc_SF" ǒK9>Ŕ1G ˇjS/تUd0߄v*/z)׏ f ם>>J~Itk˕xypkVgOc-V=O5?C}l6%n;aBZa'^UvI{Vjizz)?k7Y)vpj SsGċOb1]['7U4eP)00Xfcajln}[P2z#Ģ7N֚CT~=ڀ.?:8R $g{h qбǏڹ7Fm):%ЁZ]@I kUi2P Kǔ!7L4 Ib( 0 2֘2uE.aE.&]_2M`%h')l0~fU5hPp4<+S~' '1ۼ~ۼ7M.{i*v}U]Zbtx ㅨ_vo/>yzU*x4^*w')#ߝ 7ߚ}=nk9՛ΗVM H+ן'o9d5(g .^d kd $`C焴ףOaRD2NGO:43PvJ3z镹>}#}Rs/5竹Y 5O_=ܞ~Y\![xlxpa+!#\7^ӺUq.-|ZCxgk[ D-2/5 Z;RK@K-MxT-^h[ _,,8<{9m9!|Z/zb=@@ʁI}]ɊWG7)K -Wε^X&kQȤuX'|ZY3H\E!nѥ;!|ZvFU)&y2UT>;ERhӋu`D %^V5J2#,-Tϧ^K{`w=]Zh0ehQ;@xgocJ u2 jJNQ]Z/.H(Uv]X ,Γ"wwhԅh,Q'jSfXTLD6К]Zh!<( OʰJ5$hIc ߥ )wPD LLkyڝa=.|Z/a/=pvu {OeތMJTӇV QT`4*f)BYVH 3XR_N^2']u)76}L6}'u&^|f_oSNn;ihMkÜb=0VPd(; jIcKK ,p)ƺ쓘"ήlq1v pTz&Nqrkԭ,up,3l)53۷;;Qr8~./⫭Q'_=a7j`/K޷]72r@z Gv #m}OiOrh: h޷(x'vhBWHbd'z!ړ(G睾ɠDsNIp$p=,#Ȁ==1Ms#l9|vxS'u48d-^P{,y+9K7yqVANtC' {(Bafa9[Q\DKDmsys^ `|ɀµ!#2EJ\KjS'O?,iۊ_mY۴]%ûY'49>j(9q}ԃæEɱ% `0\(CΈ!?] M.&qi['*B07 m ]{; 13at^ +;}F\ƯF]GboZXR*Hi2P95Q^RNI.SORa!)!) 
/ǩ a8Z۩ݑo$ʮH'}Zf~@)"e,,x.˿] @fB4p4lfitv7.7=K"9`IͧOu@k 5 $]^tVb2X A0FgiH|d &w+RCnx &"u th>vHkRMK.yMoBڰT}%QB}CiB#FFƜ1 "5T!mU_#1kf5LZ(KDmˢb Mn-n+$eH}us4V\%a)C!bar -2>%sTT XJdABDBIøLj06PZIiFC`(dU8GT3e[uhks78s15'ݙo{3݂z]*RڪTF/f6inkq4- < Yۋ>|菿sw%מyKF}z7rǗn]]ec\ks??<ܿaeJl5d|Bqջ6;WV_am:Vsp]۾[b뽹 ZΕ{nNN%7u lϠs{Ȭ,f|:V (gGG,G`DzCkoԚkbuTwY㋣3o]qIT eARpj";kz;ދdtɋy\wq<%z^~Wk*>^xkiDwzG?{'o+MWL+ͦU$l߆ H_P`\kc93*S]%u>KK:.ߘA=J*6ySXՏ5?vf_'??x.um4^KBU,\+naۯ-<߅C6dgolw KWQ7_gz.W{y`8(5ӣvTψ߿J~sas {8H݃ f _F@Ȥ&RH/bV ׻xS,{WIq 99Ш\ڑQzIciv;H+ ֑L6lV.yl]/UiA, n^ ?~v(FQ%)DPf'Uw#ǹo jWB߿_{>}cqS"f>B$w674[xbH=`~It#:RO۬[!}ݎ]{JZFKotLopȚޡRAPwh 6|׷lg^Tz:ʯdjom"yZ!O14&Y+aX۬|ZB|X F# U$BI!-*`G;Hl@%2a&q])RJQPs'ļp@Kٷh \@C,M,Ǽ>,I+{"wpVfL6^3PHDk~i⛐gd:m \Th0j>˨qص"u®+>v9ӀA>{~QUS/\XF8Bq.^9ӠX2AB$޾'S!͐c>vmolݳ؋jjfW*(?gWF)ZߘNbH#ۋ;'ra>+=R싴7Q .gdԯ3i.:و_W &z6 f&G!sƵ+H 9b:@iJ(ahpCeq D߻[~~_Lyv\(jX!D]+.h+ '6p8;!(s6mb6ͲGb;XOS󟀕AJr[|l Z#.ADE]X1iY/]jg9)5B_ieq+t5/{Ţ#4 'NׇMg7nx, "EH! ġ0:F*Ø" 41VFP&Pp] br5Wq91VR`@1GqcXІHp hoҘbZx=;_hR&7N0>]KwyQXz|W*|>P-k'N3K\/qAR|RKOd/۶m8s o4:.Hͦz!Vɐbwf͕,c)n$7j$+ra*`E#krfυ𳰻_3983J`Bl=:ښnTJ c#_ |mty`^װ&wr!VpvaKh, -ʧ&>"&zƌB4c!fsV| qIۃ ɥج)47<Jf]vxDY >-`2M )Ha}$f,†upsdPVE<$Wlc60ڃ bWqW9_UgSjZ+M=Z[)NP zZ]Xz GrgsY(.+^)#5Vt- b.*s]3O>-Q֤dhvrW(ݶHXeG YS^]'6/p.pygSy"5v$A4X"+Q'R }`w<| 7l}nn((MS7i/7JLY~#^8N ݮgTDԑ7Lwr⑻tJ J}:)̭,5teM1Gd-XaP L}K`6nƾ @@XP,= uiKNOB 4hq@#MC#FLX RRPbCP!a9h.Tp D>~Qa'+R|Zjr$r6)$ga$x뤈e1;zsS1Fr!u 0ձ KtE<82 B +X|% ',6ad,lJohK$>`Jk!L3>o" LbŐJ Qx0dGTYcaDF"L$0UU@) _wۅEG :}6EP{JƱ`a\BT`;1o__ɬO4yɡcBI43 $&RIZX5 E$P.Ou製M-:NJu$oWT'N2\iv ?Q61׿AF*A* qK!HVrYxȍc4,Draj1uuamgbPɇPp' kb+`N0hVC. 0?$BA*(C ˼5$-,awnI9N–,? 
0Xx (Grtl%9_uds􊠽:JP,9+@mXaaP)R؝"qaLcX,z؝qA4JKԵie>%Qv[ ՔJӞ駛: MXM,mH,7'Y"YWd')ɥb `,XQ߯1H$!gȗ(U19F_F+g@iFaǴ0Z`*KPꉚsrՉxt多rF>& tWvlUUGcKzD >tVTROm+-)<*%( i&H 5E9+)5,؜6<4K(clۇ}~ 0FvUvᅏZǗ: ywHկo{kKB_gt{w)}a^?[9z'c\+d.Lpz&QJu'3)Zn2y4I0X_G:; &h,6oOaw%,4ݨ߇;;yyYuE˾X3X}N⧳9~ΏjWUÃh^@& mw]]/EEţ(`1_E" 49;Do8Jg>߷/vXxzj6]7wi>.GNojх:5PU Gn](nw[.ɸW4) ;G](u%0l;Df5NϏ(v4Öt uFo /ha@rԛlP,o|~t to0e,'gmgS>}ɳ{#3g˓EOyO2#;>&:ݮF<aOѓu`\0?Dj7m|:}sP΢g7yݯzha1":y_Fj v)ݙp1KIʚurM5ɇJ~գ7?}lK黋L(3HFYIx_n+֠6hb--=ȲN_~> ,L/ˆΖlw",R&+nFɟaO1ی^+lV o ?q7w^axpqsO?hv2`3cYf\O9DpC EsZSw>3<ig}7ʠ9t>\59n]ߎ[M2,y&{^_=?Oy~ u&O:?;YPΣ9٬Ÿq32z,Gz@: AOG3ݸP0[ Ɋ`~9gV;tLGǭԻkοew>>WOv]Iy5j_uTn>O_vT?a* o7I!̺^ټ귮Zm P ,R[aiJ]3 hwe4@.V G {OƮsCo`;$w,tuen~27\wy3'v3t> ܃I5Յ+aI;9o͂Gӵ#`It3C76va=x燅WIM.\aFRK%IJ! i`i,1\鄦~Ɖo $9ԍn#7ZhoudjG$f|Ck7HD%۾!gc\[L9me͇gRy,1E& d-bB ,cL h'É~uF3`jK^F Taw r9,ɣz79LG9+Z7d?6nrE }HamGx2< / 'P^LR0HF0 \Gdn4toc vZp=>/.KEK}U#P]]x=#!眲02IJP0>^4mI0MϽDXOB\0 |{-fo* DVKH k۷N<烶ZpS-PgJ9>u2vӘmġX 7dnp_E,.:wљ6І&iS8 Z LĦB$8!R'5L%ثl^{U@wYc\ jLAhQjإy+CMhC lC>b*vkHLΪBǗtJzìk!ȅ(dElP[2PG1:\ev\BVh(+rNs8GT{)(bPB$>:K/V DԎIT9a*AK.`ڠFOLқ8ʼ^05}%Dk$VPɰWB/%cʧ6LaxԤ.pS6}* 椾m V8KLnxXV_U&J2ᓈYp1G4oP&#W}{h_]ҥ<]@o2xnG󇺓~3t-tE1oq* ,e֥2HɵM`i͖*h2Xz$bS8RH @i7XlY޿mRDPf D)06Vusp͵o&K+U؆nֳ 9KyR48pHĹ:&}!o [rKGpVjv C#ĥfX.s]n-ydNV2buisdJnF4GׄE7pQA~!w1WP^.^8rh~*E{T`d+ڶ5}ۂWcd6$ Om[ཝKS-S}1iTjYM1c;E\y7r\"~0}[KH˜J1Ŏ%&\9*e2uYN%b0f F&1p,2VGA$4D_:>Ycs,䬜 :#UP ȒXtvꦘ _pmJe䧵P39蝞 B~#KǕ {鵁CX0 %܈L룀,6Y$1C-DB"Hc] 2Ճd`Ey*RM P؆MQI#y$ljG6eD'YpVԥAG8aaSu,A|ČkM,4M QCy.(׼J ZSI{԰Q^H,ueƛjͺ~s\:"FR s*HFc-)^. 
F9f8 qEԧ,ܒ'D,MRC$ME.rގM2."4?h{F]dՌFI_ л ЅTb R3k!B(Cb1<i8zZ+?M^>yaRLÇ$P!?4{c Y gTy#az&"NU H"|<-fm< ~-c$qGK((~$sz8F 4} ?}[:藑"U'_ҽ/C-=y 68;QBI-ILp<%zc y%@ O kآ"b 2=1yYݓaWK ZZK'%Φ6u/CY<>ow1Yk̭#)MDuL^1R7K,@a]] s6+\miw̘xzucY7ε$ Fד~Rڒ,I;Dx|88yR֐t@GBH:52xF#} b:p%ȱAH ,!!P 9W9TAmɌ=)[3[ق  ~ x rRF!Ą3)qO#m($bČfꦯly.`2T LH 9=rQ$\ۃDE`3>;/8bǜ@Ƀ SX6r89*$-4Օ~AR E{r"c[‹Ȱ-6S7`"rA-m,.u]r)/!'|zUu6ԑ\xEȩFu|h-N2PFTPДU BRmblZ4 Vk}ל2$J'nWK"-R`EʒZ[(%g5haSD(ƉXΟ'W9!wơYjC6*UCŮ.S`Gvii0vaWYQ m`t'8]?>MKS ʺj!Zl3>JK :jPgXC$ v!)@ ,|"[З2+s0'tK=#" UR#=J#AUU[R *i`}j}Jӡ3q(.A9 UdKUx__E>!Ŋ$x"ǫ $x=/vX~0Qn',q x]: b `GNil}BRZ($Ծ4 ĜzԖXXFZiBbٸz>^J'nԤ^(@Qy_W\0k^A5F7kiV}xb%cݛjP'Bt;KVxUx͙f V4ֹ'TVz+bˡc[;3!iܜEx0yVہfð/뵷i)psQx"hFvxߛ~҇\koiͳ<nȗ"Miܽqt[1O0Uf@C۷j槶6eҚ׫E8Ú7'ȉi`'퍒6(1~L\lj1〨ygJwub{X==Vmv @HN+!l%YqB>E& ;)Hv]Pr5hZ@&vyk ݰL #]toϐbd8#7g>ͮ~ Rp 3K%[# s+Zpe< z臝OK/VR,Vt kO84աwsNK_*6b;~BZZ+j|ina6 U(&>y~x1,uCޗ)"/).{:\І&գI.T qkIs&cj<~SMo:'Cs@i#{O{DMZ`l-K.œSkf-s(Խs"ٞgE!l?"9&I\$s`$ZMEIW@&tw+ uA20rf&tv w4R>CT6x~=n3O Ì\".X@ t^GM*^^""g$]Pm9rhc%@nY"T@Izūb뢖8#iM5g6Wcٮ ^Idލwyq7Z4y[ȟ:@oiՐsb r *y: #"JBEE! 
-m,!ר#Vjcj2-i0_|+.--԰CsU!= Zm5Ħ[VvC 5x}/2h-mrҖZ[)m]-!l{6`u$h!A0h#6ml&:Lx`yӻ|4y`, 2-/daS wm/L~x?&U1g١wJ 0b'm}9$kNG 6 fvxEf}Z%=%{I/WsNe>}BMez~>^CWfs3}ƣ:񦞄Gh}uQ:B>^nYxl?uR,gYR(Zk !vLt$7=$5a:ԟ._];9>S󢟈z{֣nFdS< Xn~܏wVOLDL㝜Uk GV4QvcbŃㅩ5Ko%3|if9yyqyz|}j]\ZϏ_v,6-DLKӞ5oM5殬_ /2> _^ZQd6NYN%et%G~zuqmLg,xPYVodM>0˳_g_=feſ=iٲ>>?^_+C^߭zB6KMRk̺?mH)EeU>2sՕdRiQey(Yr:Cic.?} UM^*s+uG%j˽w;C+q߻4kzt W丽{VD_lKR=_ߓl/bn;~ڧHO,<& ~;*"/^kzx}̆ P>N|p.*Ԧ~>EFGq$}V9iK?bލ:7&P^#yHEF"jW\4My oGXεP9W9ד6 g٧l'f'۫R.>KYSrL X &rN+a2ӯ JxʓZrh(Ex2M>\M%1' г׿?IraOsy2JkFZb{ո7Ŀ/jtr@,O0enɶƮ8`0vH)<( Z/J83&WnS3GnrBBZF-tht"a$FmG7fVZdz"N~C!1鵈gfE(pwk z8F؜|^ށO[>J?oF]y2j`)627w)AH\e>kԶ/*lXy(&Ԡŷ1+m#6Wעx4g#shx$yv2{7M-W7l&tas=I^l$l9'Skd^h)j?IHRe4T+5۵ c)Z-51ڵMɍyoԆxUKTsk*kjb0j{bg:b 'lv-q+Ӆ &]^F7)`oNV}PEdDȕ!%  BG09ҹ$VG "/Bw͍F%]t/7+'/IʅW[VRD:94DqH`vy @;9B6HEzk!yQAJEڙ^@1O=0`Qh;{ۊ m4خĦ)P󔘴ؤ>j@ (: dNGḹ(e5e<"'E91{%NǺ=ŮS8_s+mfPk)؋Mʾ'/%xk~WB"A`pRс1*^)A1ڄI(y=Xb1r"p}=X߉*jvx< ] j~AkxK@KbJQFQb..D } Oh v_X^}?\}z08j&llp`5qdD%@v  'ɣߐ8rJmf!HYCP+F0IH(Dd4.qdFASjӌe<]}l eՠUإ6:]v&vwa'C8Y$A%% h-{hu*0=_&lFa)1M6#'ys ~RW8& Dad)y,Abxdj& Vԅgd5YCK'rq/-QZ6G/_,VȔ&y,ɓd1JVDWEk] V8g35:N8a}yr6=xx-8DC2T?=,ņ#6v<ɒc/7o& cv۳roNrbμڎ:l$c` yb@<<^_Fm EbcưXkx}K, DX1`eb'"9UA )Z˓{ȫA[ފPӁ(N+(XF@BFSYD#y 9Fj:c1w;a&u$x>ňEpô~qRt.<7˞{óQo{jX۾S)7W.˫R'AYQ:6);ˆYc¿y75X` &C(Kڛis ! 
xzF mQW@AAh ՀP; ][ [Q&3iP}~AWKvXN?*c' XPM-VNHO] ]=Gv>ݢ/ҫQF$fM !hI#w<; 9<+gtPDo {cm΋+^<ߋ[Zm*>tnULA)wWn).d@)2rl@[Y-9RH:Kic CzD\vP1TϹoRm$O"<`o?\] 1O?]F9UyLͦہ(+k!r)O~wyޭ1S}XK(H4mc0L:; F5V`jc9ᵱW ʺS71muhS׉: -qyd^rҠt\9]xI8F>]lf:@" ~ Okw+E#ݍOmCЩg)f=SrS6}CcDZ' bx~ĮK-XL5.>i'\iC2gH]z{ww%[蘻u/TeNUw['5GRx*MTlb#Z+ÇfWGoIYoʄ0snv_b7$!mfה9H/_z)Ymp0n=GKj6䲭Y붱hw?t-5zikpO7nݳ.g!){f;1'*NBju?9hIt@Nѧ5 ݣg:<;՛v76  -8L0^L0jVv9$y6fNLi[i4|1ɀ9zNH'}C B梄B:, -Ŧ( EB)JR:PݿG}SE(0lu2: ӬG+5Yo>d;=˕Ӷкe]"P6q|IdMB}5cwM چ.kR $5oV2kԤ3Η4)$ > iB(f2e*fr5%e.Y)EJL1$(hByI+BQa*9ef r  !OK䅪а'/3*kNUZA;=F*:Ntz1\ǽ[3)I={9ID#䄷 oy4 g" v#A|?)#Mzpewh5v|= -8e-ّEjZ=G!|E] grΫ_z7]!,fiNt1i(C{:Xy#c))_=ZՖw?q_Ɏ՚m!d/\?a./qkcUѵU\Cow|v2^w*Td2/q(1$~fէyՒW?Zi{ñzKu_>;x_-/ޜj_wB =!`A~'هw25&:MA_ }߾^۳7~|ZܛTO8~y뜪=.j;ym-g/u2a/?ÏzFD`cAV]GEݟgMC%hdl@`Ut3:'vfYogmcǻOz ,<8%}a(u娇^iXqj]U.[\*kfv˜Gg34|yM ClPqVBU93yǻLytv3y< ɣGtOh[olR0 7"%M`{Ggm t + 3hdY,QbB&;DY'<~Iٕr8e5TI1q#)bJLhxc*0ť\iQp-Q-s2LzТ6=}BѨbgy0ktg0)!o2HZppGc=毹\CP/ #K)c JSr`ћ\"d.Zu, KJ9 K'* Ӳe!jMR\IAI(+?c }>[ -ծ/hwz FD;Z tӫ}> g۽Lѱ [hȯ\EtJOC8Vթ}>󵳭ftwk!rҩA꽼uh W\BR$ dIYcc%l(Ef(lD,AA$1ɐAȬ2xbI Dt* -νu$cr@L,4:(G2QY0p8}z eAI6^PcZIt:HEgw jIF<9fG?ݻ.볫-t{v4zy՛Oyxs7qT7wa&hͼ'Yr:sLa6Xv̏=-sZɼ l2 :OYt;venjЀyWnͼZ!;!mO@ {9#`^A{6;8~ yGXm.Yg3B 0 ވ.h-;%< 2% ~ڏZAkN<= R(9ꋄv0?fe[!py6)uVp6r 60apwlz@5cF] *E8 ߔɄ k@.{Yd#Fh!2]k7 v y3Z'H]j-wJ|2JyXnɂ.X(!"$/3?f 7b~Ϳ16q$Z-/Un?o|yRg//iK?"?[`۫޷D&Sںhd=d%ǷEDQHxeߦBgҫs'bp/x 3٢X$d2^JJ&lWC=.R*BBFq.{},JmG>ؽ|ro"TCkG| Sfg_E툧$U`$*OElL9qtQ8>:*wmL;]o#7WRcG4IfL[h[r$s GI:,Xj1 iMU=ޮ6I:Is1ؤJJDx{'A'-2Gj$V62U5zqRpIJ)ZoVS)[祗*%0,exx PlLK5RD x'>{bejyGK5B#X2p@ e;!G; *jً9 L) QlR NZiU05±V V:0iYYu֋,69v4c|l.ܶeZFxw@(%uRB_(apQ:w9k؀mcr_yhPr,jO(C' h3(u%05tc@b޻^ }{@Y^غ{1g'ŀ%9]G2xnOnz7YOqlZ jte mG6s!\SuCo 3r~\.vo"2~K^j8GAB'+F F`ES&..od)rKab~+꘧;l11pymc܅}Eo8Fŗ$і;`\3N̹6]Mz]k^?9xwk}xxq5}t}vyc%G9'px;mvmtm$ >$`ox K9sA[l)iRشD-W{{pr!d^eձ)oyf[clۣ->00 ,w6ێ2--z˴o/pgonYZ-ϒi}6Fzk%7ߌ.11Tss8ݣ QrenKn0u"\=]ݸ6n 0kS"D w+eۣ:S_8]1"UsM 7% n%-]'=. 
4Q*Ӽ}#{H+B.u%UB|:L] G1fu32SӃG W*2fWs Lt:zDd+3]Xx{kugVW?]CX68Ur@E6+ `R?ӫ(&<jNVQp$r)©2=HEԮt\([7[j5Т|UsE!a r͢p|˽b-̢Pv9=jeQLd>&UKA BEYMs<`I fF:tEznŸ'̾zmgO8GS_]?^"E4sSr OӑDz\XrDR60$&1E,*x|\=n (0R){@|<4z!"jsQPEF}4@Hs죋4 2cL!юahB)5X/ԳObAEPj*ը!XXѦ;, ؔ{Y' 4FE<3SyŃwOng/-O=2|5Y317Ng2b2]Ntv|rջ<'~MW76`<7WEeJk$p) H> 87X J^9B:q)|5mdan=b*~bTK<8j`1| cz]1$KFJ{1f9B]w]tν y{DiD_*V1O7}nɚSNM1o_Lo??a:.?ňcz)6gE6k (|#dxnfAY` :au`lac@`镊Wٿ27( rWk~3sA_lԁz/7󍴢y~bq?y. b/nJ}<5~F4=f}]XNga3XrYZ/ݸ?v"_qOʙOH:eIX=j漥VRqFVc&8p ^sV_kqyK ^'D=pTy,@Gp'3$W\1' lQ #A!:廼 XN^/): 2کdpJFvA݆XJ S!g-H:J bzW2g;YFMS~d9\Xsekʖ78fgAw巘W8 7.YY)-N^gU-5s#Ph C0 !&;K1h'DI XQl^ JFm exK$ Ӊ ؒUk"DV蜵(4c\'A@Rә떹I֪F@0fm4p,g8˸w(0 XϾD%; V5c\1 [Nl>˚۪=<) T [ 6~!ƌ Ep#NZoIv0M.AۍB]~k)D%%x8=u3".* 2e@^yad14ȢNyGg ;ϹW-`X(UqVHͩwXa+P1ݺoSg@ nʐAc5Wqp KT \G]e%0=OrN ';عNVc. -XR u4M*3< O:Ie` 4~%I% wq_tbizj#CXj!`MbqQ { ^r񪴵TsJGj5u^z :,e3/[˚Vx-Zb$ !TQ5V{@yjm)|0WҐ6c4q;0%OԮ?1zR}۟j±yB;~XY3r~ M@n TF,( 5ш5cRB&v7ES)6|YbY>wXD=$;flI ޣnj1ec{7gQ&Qq1ko^1b J=\qH YV"DzVKG1jȕ_Dx!M:#zSkost'ON>CLj5r;:/-sk\+m3:1T8ʹqWWދ"^(s9UŜFH uatb>ʬzb|uxix&)co_৫nէ77s}Z"Ň_.W9VwHa ~|죋wfɱz~Y૬al{h?lW_mb7g׭_uX,aVȿZSdOa[.)K'H ^O&ݲ'*݆X!K4kMi3񨥸J9=ZjQAnA1SUrhXde\p`huo  `k:|e*B {7G7:(<*1Z#= CRiiA %H)BXEv9v4cb6,j /WXvXֵ:+/91a7E1!|UﶋC*fBń TLYBǢ(Fuˡ!@oX}8 *r#3N-%DrG(桥J*?Ycc/)spނ=gA[m-$ %Ӗ2Iv=BDZ$bu_#FL3wG]m۟Tnṋ~pq=ܞh-Zn/~OgO0N lflkR i0CXX:v.0csA-$7"CÇ!fO.ig!0{ + 19b!6^EpVKĨӈXG^Dwn0w|ä́e1?VhtLx|?{\<bgV۩7UDSF8ΝCy)L|Zc'J'!GK|Jŕ'Px|]&i2B )0IUG2B cia{#>e JkoGO&!Pʚ+Bu5(Jp>e>Y"c<'ZY$&$W1)!`ČnEIx.U:CUsxE:~ RNyG$t ^>.c 5 ~<:t-hWͿ.n/3SR?\m.տw? W5F.¯8#{N`H n-%T%Hû͵6KN][ϐ{T[d쿗݉5 ,G`/G0@e9B+w摄4! 
͐OXV?$ 3Ǣ!j k妬}Hwsa ċƖpJKrphL 'ų'Y `3d.PիŎQz]/ $ݽ?.Iy9;IJQ G&* RwNT%sfD)UDOOTfԳ?U1|=;& a&p7Ɠa篴Wé2EW <$qW5#翫XaQS2mۏ%;Eº~x,UB܍/|r\m|@'L2 }{u{ s\g3`¨vXź!++J$m,moc9c?q%w1Ȣ?횓9ߧ~8z8s9{\}]y[_߮VWCΨ^ɉ[ Ŀ7\{|\>}{w_opws Ϳ[[t8!ō)qku&0Aff5Ru^~3Woṋ~F$EpfnyK X=O~OgO0N;lflΛ2VFp# !-1VHdKN 4R M~~zt[WB;',+Jg@s?<1"-%\!Ɨu ,՝f!9usT@[j qUL*4 8(!z&D`bh{8d_?+g1uYb샢HQ!EUhl2< ʞo>i׋<Y!KWJݝQ` H:*D%Tw%$XB :84^fq^N%U5@FZKweZؠMI`aR 6"%@Jqᘊ50Ҏ5Dw}R0a JfBStWƐJ:k|v[ -H*5EppJ"tA8H;ômZDva%|-r2BkTh\cħPx&-L|Z1SF#^'?"Iej/w l.Z0&kf,^b-||kĻV2k#a6ƶ6H!e>%~"t zTWy^*؋FY%%|,lHz+(:71Ƥθ}uh+j[ƓeZ*0ccDx36OosR{uHz85hRF8(MeMμT0•dKbخ˴'g f" `^X+o /GU\Jr&)dI?fD]Xss3ows!_a{܁pqETg܍ĤuɴU$9bBXl:7UixwxNŽYty4<;~knVaiɋBeF#,a X8¢Vmc"HBFi6O'B`oeg\EzfWlTM؀vsV xLp-%-F\FZO[OZτ7t.1`ybƊ޴SLC8RnutX}D8lj,ZX؟wW c% OGCZ̨bic:Y+lBS[ɔ\DdJo{ڍ "r1H9hMi'uv?$jhRc R O'8JRBg]sT !n+{{ \I;U]BAJSe *# ETrq\aLj.+"J u j~jl% tbe/N t_:!N]7v^?*d8ɹ',+r:eX⡘ERyMYCK?Z{M fҽߗ |:-d6Du┦J4HLdVd܈ *#=7˳*Fa]l/ R9ԁFry⅔DW+}kkҗw&<{O S~)o^yÁjUV8*uɰ*VOZ[knܘ4XG]4r|z${X4䪞$Gw yeZ׸"_iڶW}E&>V{ ƢqwQ츻XkI&[+OO.O#,PMwTMgQSq9KN.o'w3m'wrO y "T=WVെP T YWTMhrut3eu5,:>Hyzr^F8J͓%6߾US=xIɪk#urjb6N 0FtM+ I4jnkif"-c2ީ #ՌYud0f-s^hŴ0ScS+B8\Y4A;U2pz|.re%)V_ȑ&9/@n}Bq.ʽoF`֏ͅ,4¶46 fypcZ` zIMҖ #+es;g Nj1W o% :첻5(1D l[5^J& u8Cc07xR7O$~]5=! 
VaCiΧѥv9|{&f E4K\~8fr -΀泱"O;-TVBBs͒)oqk7xr1H9h9O`ɴ[6ڭ|":ZEx)cГnBB<Ӣc.Լ_ _5:fǖ2D)5] cM{<'ϯn sBnrрI2Yjg%Cˈ,wƓ0̱hI`n\9t )c.RIyWomp[Z6-bٞI,˦杋eg-ӽ|;96!EY^%M{GfbݷUN02uM<=9/czjkgg1m{v޿Hȿ؍>Σ@"P;j֑t@U?WSkT,t 깭:=;w("2ԤF{$ٻ6lWzEJ} $3uyH VE*ͦH-m23]u1rOF9VFq\M4neݏۥ$H^jq!1:xMeJ?!.DA[Ek90≶ C՘$S3' 昊)q@3{AE'Aq4F*U.\E "fȱ*j!j4)_\E4H1F1Ģ'2अM$aKkgSrBy88ɒ”4eAQbQJ1I%H26 y\GCZ' 8:)p FZ@5%w 4HJZkJ>OH45L"O4G\R% J2%?RC$ 񂥠$$B jm !1 X#6(eG6o3G"S| 8̂ CX2LC)S M,D  5V ;G"N"  I\5aqYPSw'uF洏BdN`[N|H~<Պ5CF)אs@u7IqQA*/LL `i)|ӱ.&悖`<Ys '8y + @C6Ly@!)-xlSLZ?u'8=L  *ZS̥>v8퀰q 6 tR{0Ґ ScL>]4G<Y0.cDQp{*ϩhhŵw$k:26f   .jyνhHNH΂OU< PX j'\3<_S7ySÛfO͹z\|X$+G7ұo%H%s:ns=ϛ*>.&(:~^&i Re2X(D@3^o qk?YsSq=W{\6?̮P]-ټq^bx(sszX ʋ]ŮbWW b9'Ƚ_vPYan{3/կ͛ ϕT=.w8d mDMݢ{)2 aFoϠtRY'b!cF-2pS>UӑŮbWyUAx?> zrbxrl:DcDr(M!D>jH)I5d*];En2/>m>wNx>3 ?lZ{aU` N0]hUZ,@z'Gd ݷzz@`^ :R{&`JT!4Pﰳ=< 91 H\sfɔBBHNjw.zd! "C%!IԚrk(om݄28hi$9)pց1y<ڟ#6 ,AI@Plc4=hNiY1$*fv#xۙ_NC\ ӻFQF8JChB&) 9rxx"zyWeRxe$Gw`7Ĉ1qBApU(䵹˴,?tbE:1E\ԝCqdB4jGB11afde*CȔ6Nh0hS0 \LR  l0zՅ]RxNI ePhq/$Bi%D XZ?Z.CͲ-hLr:Źt*(jnĘ T- Q A{8qDly2z;yq[u@o"Wb S;h}E2fd 'E +4#7e$&4DQU;EQU^'sum|{df[^dWØgukIVNU ;G_ܘW "d=NF̶my!ў?5<HUĈǂ⊜}8p'=Kny;DbvƛGfR3moAyzz :YhৢSтL0(h|`= )?&Iʽ#c7*$nth#Ȃѱ;jUgciwVVKFU.ֳJb2zWR36ڕFE=.`Jl=5/L8TQ2!xۨc>B*ȗN+ZN4sӜ k MkDvCO_BBlzBG(“Di$:vLѿk܆a怠~2ojTsBu['o%ֻOb0Pf y^ofe-?4ifz`"|^o1[fڮ"o-kAmev-_eyFHiF|Jи749p"Y_Y7ÙyG_EƮI򺹍ݏ>8EaOoݪR-9gE=v쐷-RJO^ f-@)?.L9^}\˛ǗϿYZ[- Ҟȫ~>\_O1U \r%]@Tٛz!uGv.O`O:nwՏMZ!O^ruc-$qu>[^.$PA‚/Y,2 jq =IsCͷ3,|r0鿫t6a>{b1T|=Wo>uec&=L Ht?45&bYነVş31_;&l`e!%DZpaYo.._/_?|(E^4q.\\u1zӿA~^ey󪯟Ol咊`YR^ ]T‹c2j"hR ˱f(kd~^mPl^-X-'҅.\W CgY /fͦ{{uoY(^~=KMB W迾7_\jƭkۧ|o]ξ__ A nuaXo%k{q|X<)2Λ3R9>Y%B(FDM#0!cY.x>p >B/Q,"N0!\W! 
k@DmifzHIVғb c S'+kj}MC^ٟEi9Z];!k>4> N-W['ڤ+2{ghsmiYon?؄Ykd簓P 48ï.b>sK_d\۵۳dKe{M=!}ID:a1r!pL; VX#|Rx &D TؑT# 1'&ML*+`,IJ{s㩆ZQ#NوI"&sϮ5 N*m%11p1"eP)0zQ`j#Zy7#9OYU]]]]m@A /Z)Jeb{CpDR2mujIgEZ-x 6y i P̀PQܵ 7 MNH<#aRrh!_ !h K:h(RTZ fCKf'AS` IK#NۂIkO1*)Ix#})"V竴edޡ>S6}а&UC5BJ-lV -q1bVǙoBU7+E^@7QE>6 BS,k ij)!FڽPeaⷕ滯 TTQ1l5ӄJ4Z}(7<5<:2fWCBDj-$+ߣH >]\Lm)#IhA,>t1X0'K r-bJQ)(-Y r 1n`T885ռSpx^9??\~]>E:7(!}X_~}' d ]}/ .T5dۻG6\wcZOθ*<}P.R֙zVoz-GV}Ybo׿?4Oyw~*w=PهOW`W,_?TAF<9`Yq>RAGa߸{އG{1})8X>nS}W_mS{J*tlO ZW9ëq//~GBxr[FNs>4펬׺|'3a6+c=# r\]^(t&\u$:^v~VPcgJfr >[w^-Wi׸EtIt<{Spy2 <>py$g`/6M椅h/yu 㶮t/!m@`}'If潵_?¹$ZSsM-_شYs6QfkoVOS,?_G7_=w.c'8G< wrsq52 [%2`*["= zPX{ X&ǎ${S gAT I\j/B=uO-Ӟ4磖 yKu v0q`,K@n]X1a1;2.o~Ș7:?Č908$fEZR[>5Ga22GYջ0{F)|Ԍ89Ã$~KyH̉0ڦ"]R8KB&6c/ȼ5nyUAhgSP>c;!e%>\RY=x!DVAآGu> ľ+ϰH()4`$uL+X7)Z˕ >޹Vyat0MYC2U0kJ~g*kQXdV-g5UFqih?5F ̙*Jфkj1XJKS6&dC 5 >g#6V_4 N2VPjKH>DeB3^YijA#B'iXun4{kze)!d/i{ye}[T@7Ekk퓞\6jɡGT nvdkTwNQ9{BUº@Yɜ424PF>$Oԟܜ[Zm 3Z!֚U|!@TڝFV8VAѨg*b%;¹jS@~EsuDLTWDcJۨH(Oݺde!O/Y=׽{޷`+r>/ӨާCO^m~<ǚN5BO.ˋO^=7~AN|WWUoo6\O7X|cEyVy+X'm>^/!f_W#C~Z/86<}5nIo%}WKXeaE$ͬK 7o_('69k)oUGGfA;Էvt5o~Ξp KW<KӋχ&X2{k,[X2xl|9[^ĔL"66]S(];pqDCTn*~4LnmxÖ ϧqk>B۬pQj0ʇ ulj|ܜ Mjn;fGxd ny/yq7XSv6 Xt`m|DNa\򆑪zug<2ot{K[m#]9J.j-Inc[6b4tGO;Kl7>tS#V|>amV8M8Ӹi6+;iQboi {U)  dqhD Bo=Psu%keڪ`-jBS$[Tl7iΔ)٦b?w"nznt''=kLh dk̤|"VAKƠJM}YgLjoT}w׏gh =:?W:8'c_4곔=\飿]sx[[UIT8`a)vAq}SXe#x~M~#a7XӴd}X:ǚH 27PZ" {)ׄ'%hrOQܛۢ)8>DD4/ٵس v4h9OBJCff>?)#sk)|Rz?7ۨC/R8)5ܥX$>oQ맾x-ͼ!$>oQ'/K)R<"ʹ&oA\ s6URL.PQT]ں|AN:Kq;Pxgbͩ ;7jL&_.@7iS> RzuM2mt [)XS&P25df#L(:U}w15vkژEl8a.>qG6\Z'`^ |snT&ZhPOz,86>8M\xFF`g۸"ûêCr 䇢q{Z\~".8'崹C#W,E+Q4 9y@2q?#$Ocb~i|wWup?.֙LRKÜW'.e=/ŭjyP3G9n*hV&Y[9QhHPg4ρ[_A%(JY XFV g̖J69OBv^i1]_x.gy"jxbl4!, FHUk+Z};lN&,[fW^uG i`I]0apIy7B]O>V2A7陸N) QrMz>RB\ֲjvMeJC<8I%d1CJ 0$`tJp(l;% l֚%\Tonއi\ %TbPH1n]1,K)U#0% BlS|lD|^hLMt1ܱ>JQ<&ɰ*s8QC#o3SoRMӜ]$5l~K!nCJ^o" #sяg?\:MAt)h keÌA73mll9y90^#8qhdOmnpX4>ϯTjݦNl>׍x=Mr@4U -u{h%!tܛtc:} 41C#UԗG,hc`'Ww4L~M3zE+-MjSe?nߘoAyo1%.YwbkZ umn,H]Uq%Tz7kNB*'KgC 
"tLVS}|u~?,q/>|V:Lt~7K~Z/Ϝ?);aFj@IJ7ô]/_Z:r\$Za9JRʒ1iGQ V`\ r7[Ejxb/7C:Rn$nڎ>l$Ѝ )o,#ֽ$kkTyl^v0~nmxŲ1xZ6Ϻx[cG? a/7o#-- xݻGd ޽D>frѥNo$Gd‹ I^f>y 14ІQC15>}d^}lZ;74 %]ݒÏqzAo%LfcpT R`Ș!=F7d+/;i87bjjF @zD~R.󣳛jfNAK Lz}=#*rI 7qv C:3>k[CX[ZQ¢5XUT}$Fu?ZϮ(I4+W^:=uG֗9m/jDֿ*+Ӻ !_)4qnX7=Zw< ͗0TzBAǓܮ-/%! k׎!JW98>ⶏ)*[XkT6xGvܴn"v˃,]gqBxPޝshWu'49(h9RZ<޵S}F5DzsZ<甧jyw^ 򴴦:UꜴT@ ~ZkTCvR)Tu߹O-}u95N%[>GZ;|;iqi:gYcguߑEVR߾}RSongE|,*c_&bZVTܽkL5=YQ} InכZ]]1dN󗢤B zYQoyckц`" 99lv%Y @(*D[Hlaru  !rd!,PLdp2 !T&}5,BB B,qcRy"Ĕi%UʉʠdXiÌBΥƗhTG-e0<yւ&n%3Y{-ݧ8Z ~ǭ`T'[zZhinQsT_Ś 6[KAi)$)RPyZZS$lqk0yZ*L稾XIK\KQi)TVL04:]75dZKeH)-O<}$UpNQY]ʕ|+gT^ռۙ[pTOjnd,YY-. lӭgTFBbWz=a0JEc˰9-F[ZJ9`#Q[})[J* 2A}ʚD) p;bP[">H0q]M0k9R}IFd$츾$NAGFLAD ^-dn1 /JeUP{ č8*t!/KAyݢ䣘3F>-YX:rR(+Be#4 HR(Ls봗BVdڙʹbĶ,dǩJkh\#D9EAͦ{nKQen7ϗwY:Dշtgbv}e}Ki\ַd][߲ cV՘@oeiSb2)^6 Na`v Aw(6{Nxٗ/EK}7l"OJXkknF9Cqj/Sٵ+ڗM0FuD*~^7 3MR!oht Ju!(KָtuZ j;j=`;@r+NQOn&љ/[9LԺ/#ه=/}lÊ C͉w ժs.@G ܕu9+DjCPI%@@WU,'wD;X0y*&bicR㧕3U!lERj:p-J* gƳZ?sgWɢ#EzUއ~^x07ߖ%<)ylb^R_o\1EfP7_/>hHC~G?\߆{"E;~ ?x}}t6WǻǛw}4_^} ¹17z:f ~RHScLhLd.X)ośOH3U:AURnD82'==oc}WCwvÒ/ʓ/YY}9Z{| =wٍ..۠i῾k,460"76M /4,29SVUiuO $!ǻ`ګ/l^ZuMGzoG5=cS}3ẘ\z15\½E΀J)Hnl-V])b&OR ֝rM9!;HY=1B[](QBYΓ%utԖb\ K{n2uҨ^AV.=C@yACpbz#\;7|o0ϖA=xѠ#LM}'N:ь{^WnT/_; ~rzO5Iz"'Zۨ]J|%V׊+}zS @){onvZ&?3@0*X(A1 $QԆSIvD̒U LRua0fǞ'q/Z.(`Hbtkpz$ I ǜ#ߐԝhIXhqS\;[}nl1rw>Il_i`iE89 M?c>̍[ d$FSgt7.VM2~S:)}҉Mpm(gH@5"ÌRVeAK E)0p~:bc OKU󖝐4pbaӁ_<|419gZwwQ_m|5)g?JpL5RTJBJB)fi[ys-VӖ*c^㞹"y8:#P@:: Yܟ&Px#PҞ|jø $ A$0/ycX2RXa K `eE^SV̽^AL !J弇cZz r@kANI1xa)m$``D rWyc+fqym]E8EJ5)4eKG.ǘH0FzK8Rx8u8$D+=?1!ό')"'LCo|Áyx"?X+Exa֞jEsPj)H%OBju@КVmն<[g|ЕWhոtʮR6]Eγ g8萒`CF,A"AXJ.Ln6r`]ҟ6^2w{{<{S$\g`ů_?nm1w`Qk4΃C* +4A=`# ʧcF8"gc$qC-uH%E"F+0~^pCXiA;̙Z~TUVP !A'_~R,ͥG 9yHo>!=.)TDyt /rwr1:Onqq'G='7dʓ5*\fϓ,")1|+NNB$H4I.9!Ox9gE |-[@QNRpȜ; + -(`%0*y5Vw 5lW†ʬ;`i){u?;`F[ j~jtW}Doä=Hb/G^{*kz/}]S_O>f^cK)|ܱP̵ͪ^aS`xԬ|JĴ 츳aȞR4- REŰ*UQJhR!NB 93R5 tU鉛o ږXUPpS X°AVt)K#D5v.j NYX',VRb+ia&ՑvOjW-sv.C+tv.r.a聞 wl'bBA[oX\xp)XM-׍l:\%3iZ! 
Cm5`O5pg|Vm9XÕ2*aBrTFpD9Y)+Y In75ǽYs@;&$6e):ْf3#ݿx%nk]m!$G+к.XRվ*H$aXWps^v+$ne/Q"D:<2^>MT\e Ұ+ԧ!\!RRNhjJ(TEdS`k^rJT28vU }|Yi09) 81cL8 Yw\U!o->MMZBYa;6 fG!{hv(+A)K7&UJT|G  ^$*Dys >t=1;Kb07(1\=)>\FYUIRdQ" a)HЕ"%s \$#d[s^ؕ#B@HJ¢j`Fֺ攽WXUއ^o#njOcK۷P%7_/>TO"TV>6_]`h DvF?/I|F{v}{0^}I@QFOg̗'Dz=~%i2www~خ?HP)o(K^4`oG-ޭA՘G i}acYTLxBdQ$2l!b 6[P~9}weVi{ ]Ic29F{z0wd㶕B͒8[')fE<9ž \* y%9Y=6Z&16Y)8%tӖ G4OddQ@2qbocZ$` %UlMmv/56a7~9gd/K7EUndO5È~T'r$PM?ԁ\c>P'v'H>OPD쐁&]  t@A*Q9@炞fjA{aa;n4ou|+~vܖ(j !蟓75\MoL]%NA+8M$cw+/fb|1NUhu,vP7]i7HzX[^Zl2Yn)6ń8v˻IL)xTĘN;x#+J*ng-|&dS|ixTĘN;x#&Jġ[z{wa!_nTd3QR%QfL򸃚0cF؉I/Q.Ux.nn"0Gxz뻅OSfZIV,k g%c HytBlyʛ1ۀ*¶JR'U1LRc*%e:?$0!. xeMIi:O=Yٝ/0~]VC7ä g2&3k=YhMe1*5+ @B>/)G<&/Qٻ9n#,%wAfBd K `w"Yrv׎/GΌV=]vgv5[_b= bZ3x!ㄵGIxЛ@BөGTںS@"{ Z/ fvUkHze_Aol:{E@K ʊW@rn>^:ksgh"(nK^6e(g7A?Vz:)] yud-uw$(W,v^f A%8YG\rV,;lug\xmr3bqMXiԌ=\Z&cʿAuvW<|2$81MP-2g ט髹b-ryi9+-~4=/u{k#GUb4}=.fr^ccN yQ9ZxH@Dܻ< 籋 *6'R-uЉTbu(]趨̼f{x&Ze|`bKp䒁D:}#LH/ľ5#w]k^j E:G!.| ^\W TVk5SrUFdd؜<*Q!mg$Wh>f K"v?-ɢ n!v7kJGDQ[T*!:JiNl "pu43U&37j@8m5k|hWtIG.9X8S̟\H9 D昈KsLpD[t g9(1YI@i,B{eb 'RL6eDؽ٥7Gv@'-Y{B%z0%^}GRLBJ?O{}71Bq6K;ؔbq<[|2x=r[ObqMIiܤ=\b&cAuyMb2bbw¸NKq&? QzH1殊9`łRL⇟?{q! 
̖bROSwi[u"wQY4HzkO1)y2N@sM1dg \5NU%2#{Bv]LkNbq`Lt`ڴـɦ4d2-+o`B TuX@vun34JfPKB;#-?͈%ieVws2Bkb$Sҩ&w (iG Ql&Dku wWca3i~o߿Ϟ|1S+qpgrl LmY`a"&{c,j|}Νu*.V7w8lܽ{Ov>~juke߼Z-o~nog9wi7w"gxD>,Rͫ3(_ޖ71;x][/>Տ}w>~/StOW\Ney:¥kލ{\&Ґ/\EtwKsu:uŠDuj1ĺѻ݉l֭>'кM!_)þust[-%S!팧$S˳Y4B6|*:^fT' j۟A :0Y aW 7'$*.,̎>?1{X譺uAwpvjnU@9SD*12BEt8Xgt>f K` Nw0-$xdZEL1QmLT^IRr YanQ{lZsc@rv -AXʼnґ6@@NKhaoŖtʍ޲.!8Ohh0/ NTf958I&ph=JP"&gʦcnm0`gb hxbַNccsBJ=4(D/p3=<(d̀b",#,QsJ KKi^6($,Eg2veB64ɵjouKАfX{tX~LpIsEq{7^wwAiɶFqm$Tc1pӢ'g9@ŘvUl>F@ZݙiWH 658FY26Yc]џ8t>]?7zyJR#qZPtZ;^xi3]BZb٠ A%(pu +9%Շ`Ĕ+d>:i)EC';y*a7$RYrH<$ 9Ѕ7rZӗ9J8Doԉgn DʦqrQ3tDEU\ n#1@vΐpv&W-H [5Th@.G D ]c#+PlCÉ),9kQ,QeW%_qR^a 2߀Pr=%`%C,\R["] wp~vMV:a7_s5)a]ɘ(9@`ӯhW9Ko>)d0JOu3fٟO f uOZKgUi+c)= `mF뷳QJ{j_0d0lE{+[#mNNAP: yǤ$JBւ@6}}%2rp諨v/"d#'fu(эDڐ~ >6AQ[ kT~ )NZu''혥Tf!*HP_4O[JΓR+oZR4A͈k0OJBpR ')=j)E'XLP!( TcR-YUɱe❸2MhAAR܃ k5ưsHdIv |nm?.0ޓ zԷq^+Ňis lAu0ly,pŔZ=0et-{k+qWVL> =[kwģ+ !/V5-Zaj8,] U7fBQ7JeB,qB3bBE^8 HUоزƫzԬY˴*服4ģn<$xՉ7x"հ" nO0~k  ?^~#9l6~5K* ڡ\%#2hDTB+[1!_5)r,SJYJ1v {rGepn83֦qCi4݇0#p4)y!0.䃴>ɢFcQ_ķ>iII葊RcrDĩP~mx>XxPdy$G2^D"xz.O5.e=Ep ][qTq>xQ7K~c*sJLzn RҊY=5zL]/j9>gNQr;Šg -N+8 +G:=%Da "Eڱ$h@,WrްNOLCCG^- y@OuzG~]t2< )e:OJY.TqR 7! ʎ 㜖J΃ ǢTX)-joTV"Uo 1*Yu UAMimio2OJ2 jK^ 5v&=f)ՙդzYM!tf5x*W')=j)e' 7(;)e''͓RS)d z{ޗZ{o^@zҋ&-cs[Q-RR lשnriX|j-f@(6NDc7x$Ŗ5C՛iX N,1 tRqDFuŔŢP91g0,,-y'C8D,F78|q夰qB'*$8_ݒ'B"Q;0;YLn*%wf)egtμ`.ImHij^QS$ Uv4tqB0#yޡ"BeYnhMMv3t SZ~joRއ:)~G̢: e \(­ 5e4. ܗjG FBhI N!Haf9fza!c߇I]C /A}ф85?`ȔR1%J}֢L凊a#L:m>}duRJCToZ=4T+\Od2񹱿^_?-Bk$ f-OOӉ+Q79gS5a/ŗKŬzWxw&BQ(LJgwldǷqGnNrLRhSO?idde#7ΟL9p0W7:k?I8:uN!Ms/dVg"ZX (H)BХ*5%hK⥬tFJ%Zlu heXiH_V՜R.wd)2v97wU(M׈yU*ɠ]t$.s]V govX~[!y:K`]|4PqcXEF3*wFvHZ KYS EU>Y=9g=gsfK}f0~L;nIdAݕvvӣsq3tjQ) dA6GMQF u7}7_g+I/ģ;,-}<;[f)z1]vAWnqTKp+'lPy_>]|}n-w_M7UϞtή#6O?>o#-/-/-/-/\7g-L5LqtF W7n"SOET%rǿ x-;tB|.<^k%4rDm9H * 9mG@ 8tRKvmޛ"z \fV1;;z")Rw 3MfQ xsyz8bl2r~WMO^Jt'};jƟõ[pMʗ7 })mEƪYzW,D*zg3XUX׽Yb:Nz4U+{Eu h{݀=j4kn{te"Ok⮵[L6EL8k7Uk'G֗9w6mDej!$. 
ʬ|HȻk~i@6$ofܔFu{yD}vçWh@k: ؝ַ(l` 轳JOo/jrGdIw6Dd?Ջ)) !k;w7}h^@?Hs] ԏ/Rz{H5?L{֤ oրM6+>4\]*5ثۨʩR9o@sE$k@xߨ0 j:rs[e J\Y@U&D 偻|vrרXrZAa#u<jv+ěNAM#C4!6(V&R*􏓯.I yUH 32}{}ܛodYe$I>/Yς"D٦1F%-g%,:\ؒpJpGTKi &|pF~gYMW[/dyG7I˛+{|LS=>E$ϲ^Y9u}xt)&*uj AnCfK{*)}`p*dtT jOtݒښ8Ei=79rD)L}|T$TѪZ]$Z;YؑUWIqUùU 0BlQńe U8fiQ 9X=VDTH^+RXxS Qҟf+dpOzc9pi>*B(U /byvl":v9J"8*ʺS^2Hj)9oؒ_;T %RsTo];M=i/:ZʜpqTpf1l`J;m~S2Kh5#Cۜ7 kKpr";2"<}HpË-meJFĖWO|-Qѳ)dTX$VZ hbԬt,%[E z(#qO<#.A:{M7Hw>iHhؐwnlkT7⡄8·{aZDKɟiw*|(_V||&N9cɤu\ɉ%zesei k!Qr4+ w, g([Y*n|)GHf36z-)mIShWc*inУ; L)!^̲DN XDMi;"66L K 'o/m{OW׀]^?q#g0x{@Iq-7Nc9VD ⯪6,|n[QnmE Ef5}x::}$m %6-ku!뼠$4x4$:ْjUro:lN%W3ި^rzkʮCE@;oY0^ MXޣ^2KK>f?d#6"P3D4P)r yC A2hFFiF\0<82@d)##\SPE*d(0[ޗahŠX.eFԴg] iqtaT:6K{aޫ px{-YP"Z/,JiRs(Ǒo\-w$gI⻕ @DcbӖ5> f!!1, |>v(AU8P7,N=78#G;h1tICeQI3B<:՚TZpD3i]P()U+y>Dւ} 92̚Rɬѩ\8K\iII,]Ǵ0v{ % n?q^&=HHc$Kw B@LٱȘZNnzü",! ngT bd}E 1Oqhl_^yD@xxomUytRd fVR"Ŭ:(5gD%ʈ8RLjoA*P2ghgg셽؉ߋܑ"LC Tob/r+AI bʄGMVŌJF\I%I'\Si$cUЌz4=H%Pc!ԉ[Xr؜[pTZdũgσpzҒi%k~B|.@СQHj}V$|gfapJePWۉ_C`8i)k e8&j2 3 ` +F@P@BY  @=v\nمsmV.k+uuGtga%R.L 4uKeY IyzK?1>(H*_䓻4'~0ҍ\jT'vA^̰#Q0@?'lD<!<1 {X=Kܯ1zIY,YX=,i]F͒'#)30e$d+]-0ꒌ<ycP#!je iL xtQd]{F;3`v_d]z)v%͂>NIZXGICONJQ>_,NEkB: 31Vp2ŁwV@]Lȶ@f8QBGxĥ:rj7x@]Q9W`\$P ֕'IIԭiJ.sV@@N `RH-(<*k=:Hn"P#Ö[r3uCaM˽&NXǩ[H֍L!64O?3 [}ͨɫMq'x>>޺eQ19iա Q{Sw{zkFn㴿bHϺb0Iݪ&u E4K$~ݶ̨ri":fO'n`m E4Knzϵ[Pri":7ASؘЮ, W1YSR*.RKK)]ј5NЮ2m %+i)n G)T"(0I-EJ&` U ҁrhG1P{3 NMy{ДZ[Ǵkg`#I~0 E;cJ( #GDLbw YX#ôHd۷ٺ@eWpގND8i$d8w(=Ԭ]3tԇ4KE_T%p:E+E2}I|d9vqZibd7MSlb?Oܑs*s5AmmۯmʁymՓq?dw@h4;w%+%1G}ä~ШęcH H$,T=FXF-.j+;8BWImyn3 !ݧPd@$0 \(# |) K x$ {G0x$[\[e{erA¿Zk`F"Xu[ ?3ELRXƀx[_TDִuUDަ2MoP~`Oؽ${5O!-T0t%3E<-S͟i}ؿͪiI8s9ky2^@P >PT-ktG]~{dDzϿ*ԗ l0l\vZ^IA8kntcgm80";k+0_4)PV^7HN Dx1+ɨN+[KEqA0|7^4|NH@UZJԕ1"mxJJn$@ǙvweWI;:.ʡwYٌRL4[ʄR+%JC!@["ZgdLC̮o 4%LP"/9a J`-KTKN%RrivɈH(R>RFRFAPjQbyq6؍@* \ؠ8LRJL+$LqbQ-!4TJQzZD+c ,@Ø8Z;ъr%Ts/)Z5O1qbAfRNRk>2\%{;ȠrS`5q> wqG -[߭o?\rqm#E7b}pIoWMnɁB*1!1ScNCItgRLQ/ḲzSjuU)J`TExU[I_6 ru[Qg+e[џjA`Ǯ=XإYu]C5vO~cuu]4nUޥP$DOW.'6'` R.ډq)y~g\`zDCs:3$/`p%'lm_| 
x~#rgOs=,y29Lɔ>HaKV.9!SDA̍ 0vKA@Viq>5G6ragR 'QlY{ M[=ց5#q%S |FUV-V6Iw.92)0jkSn4Hwtny_ɴ[~9XvBBs͒VΩzϴ sȜ[& c;:n3"ar:-?w@Bs-SqEw4`4C"۱͔KBhRoèR)e)}p5N%w/\iAx,NGő;ǧ99> {;NjĹqBHN̹fI 2ZR{SQjƔ("AWXVYu8w)Z^fDG,D<|ҭ1n!XPrsǮ;Ew O%3uEj\[$-N-id& %MY:&8GpT'o u#b+Y~Y &Ϝb"*ug=xI8"xtӚ0Zٲ(9UTQ3rP `p$#57\~1R,7rX쑲 ro)rO3_:"0̹[KWJG1dEno(y,op2(*hopNR uJ=J88))p>|V D-`$T9[A(lbV 9+W111ƹ'!/AqjhI|OFuZ6u>)}ꏻP3YJ_R&T-IHKP܁:쭳r)e:MJR*: )} ;P3R MJj'!/AqjF:^bv`'!h5gg)}R3 ڱYm7P2)u lea #LI2_nƦbNw|MD\eiI2QEYrYTJ/XYxބ}*Hy]ބ;<j*x`NQk=Q3ҳPHϼ+C4Sх4jezyK}Fz~ UyU|p6o;`D0c b ]3BRNJ Q[΁,*UA3H8o=JTȭeI!*Y5%HA iD^[[{6$ / ln~V(F~ y8ȒBRd߯zHJ)<83)0%TWUWWutYp2.6g&LHƴ(L;z ١H('JEY#DAbCCE;1nm GB{%+ѳ#1:{=z'7YQc`ōia\he/mY܃n&Y܍hwͥGN$;NURԜt8X7O> BY#1f=gq[$"qcIH b)ɌLY13F4p)D>E jМ#oOw#mLΉLK5JK3"H3L @LK Զ>sIrin,\yZ(B huNv2,}/Yy $ܘKԁK[A}Qkv[RUfJ\LPag=唹F汅qsS$NS\YT,~۵mdX@̆ mr.s Eh@#@ı5e}{߭5x%hb.~s#=!|4$5~5vAux>s I3s8fdpsCs훯$}|2ȨIg m{5tA T ٠ZuX W-Vz1J6oL\wǕe >' v )PXLzL*2!R ઔ8 klڊz*gq\Cckr]bUZ+븥EjZQ6{:k;xk4A%H+Alqoc$5;t!N]>-x)wRmdD >T&ϕxXK1Q SB:zB;zTF/NQtU iY78 #m-ދ \5)V9/Xՠ"%=!(, K0Bk%ru;Ӛ05!"\=4ġlY3@{ LʦJaвRgOvo&Z}V:6 Ӷf0PK,xsI,^-LlngղKÇEPZHuz3e,_{z{ ^T$Ru_;3S;K gDd·@S-^.Xj "|uJBۓtb ^*ucrCrS?J7z[WİNw4ni_R'ݺLFJ>8+gN<8H7Rq ҭ+bX;H1)pҭ{VAtCrmSH&\]=?W:fEmdaV[8:9Wn 8>G-MRU)=&|r]R􄹔w9бVi{6ޒ%w5 ^ԡXQ{*8=c*A]QȎ$_T|Ezv'h(헰$˽w6I*?G'=,_ֱPBc}8P:,Biaک/;=ۛ7[ /@KSmZ|L>Zu|'lUw,+e` F59qҎWN,S>)4D2*dΌ/ȪI㥾[lVb^עww$㸕G4jE# ܙ6hCCЄKkP=[ Uɨs98ҧ\y-Msu!̍,BeZspN9/w&+$~7ZCC6>H/@}QKӾ$\*Zgwɔ 8uFSH c3#mX@zCcxYv2rnI0>·Nl5'C26Xe8F3\p{$Fg GorLD&4jpl>2QgY6C&Aԗp_7AaWȪvA;R%Q%YfXhj/VMq_|cq>|=￿*C =_^}5d=!nu~n˰I.& 䲺I63ۤ+DŽip"o]`ܵpB"x3 z83 K{L~OnPl5eD" ,S %,D'9Oγχ̰/I8 tܝĝ\ji'Y4F4WQ~O@Ȅ0$xW |4n tn?.C)_R2X3|)!ڛѳ~}VDL m䲜-I` &7~f?b(^ 62j#B ӳOodsQZmRjv햫763PÅl|r΍"c "774WX؞3P3ڐiђppw>pG  فeJKZTU!Dbqs2i- n[gK*Y^ڭvІvoگW\ Bűy20RGpyR;^\Is4_$D-*.k q-<X;6*>w7V6b+^hGXLF#a-&0"2{A5!G+uӐ5n|^fdXIm܃ #e[ rw7:;Swk@Yj@;;f-kvfP+'ew|i5r^6J9.V+҃95v{bBL kWĴ&]XǓօمY OA%jg)H8ah"ӻM-Lun}pW΢]xJkO181MۘVFenݓ"[hl$Odrqwϕao9[ x:'~ԜUñO1GzxYrrǥ%Pͪ:s)r8.e 7$TF&H>V.t 
Mar 20 13:23:03 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:23:04 crc restorecon[4758]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 13:23:04 crc restorecon[4758]:
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 
crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 
crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc 
restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:23:04 crc restorecon[4758]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:23:04 crc restorecon[4758]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:23:04 crc restorecon[4758]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:23:05 crc kubenswrapper[4856]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.548956 4856 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554098 4856 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554118 4856 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554123 4856 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554128 4856 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554133 4856 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554138 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554143 4856 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554147 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554153 4856 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. 
It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554160 4856 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554166 4856 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554173 4856 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554178 4856 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554190 4856 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554194 4856 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554198 4856 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554201 4856 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554205 4856 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554208 4856 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554213 4856 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554217 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554222 4856 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554226 4856 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554230 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554234 4856 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554238 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554241 4856 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554245 4856 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554248 4856 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554252 4856 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554256 4856 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554260 4856 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554266 4856 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554292 4856 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554298 4856 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554303 4856 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554307 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554313 4856 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554318 4856 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554322 4856 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554326 4856 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554329 4856 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554332 4856 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554350 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554354 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554359 4856 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554363 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554367 4856 feature_gate.go:330] 
unrecognized feature gate: MixedCPUsAllocation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554371 4856 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554377 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554381 4856 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554385 4856 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554390 4856 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554394 4856 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554398 4856 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554403 4856 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554407 4856 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554411 4856 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554416 4856 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554420 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554424 4856 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554428 4856 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:23:05 crc kubenswrapper[4856]: 
W0320 13:23:05.554435 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554439 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554443 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554448 4856 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554452 4856 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554456 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554461 4856 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554465 4856 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.554472 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555163 4856 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555176 4856 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555185 4856 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555191 4856 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555196 4856 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555201 4856 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 
13:23:05.555206 4856 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555211 4856 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555215 4856 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555219 4856 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555224 4856 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555228 4856 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555232 4856 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555236 4856 flags.go:64] FLAG: --cgroup-root="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555241 4856 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555245 4856 flags.go:64] FLAG: --client-ca-file="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555249 4856 flags.go:64] FLAG: --cloud-config="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555253 4856 flags.go:64] FLAG: --cloud-provider="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555257 4856 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555262 4856 flags.go:64] FLAG: --cluster-domain="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555271 4856 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555289 4856 flags.go:64] FLAG: --config-dir="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555293 4856 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 13:23:05 
crc kubenswrapper[4856]: I0320 13:23:05.555298 4856 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555303 4856 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555307 4856 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555311 4856 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555316 4856 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555320 4856 flags.go:64] FLAG: --contention-profiling="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555324 4856 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555328 4856 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555332 4856 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555336 4856 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555342 4856 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555346 4856 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555350 4856 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555354 4856 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555358 4856 flags.go:64] FLAG: --enable-server="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555362 4856 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555374 4856 flags.go:64] 
FLAG: --event-burst="100" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555378 4856 flags.go:64] FLAG: --event-qps="50" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555382 4856 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555386 4856 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555390 4856 flags.go:64] FLAG: --eviction-hard="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555395 4856 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555399 4856 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555403 4856 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555407 4856 flags.go:64] FLAG: --eviction-soft="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555411 4856 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555415 4856 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555420 4856 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555424 4856 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555428 4856 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555432 4856 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555436 4856 flags.go:64] FLAG: --feature-gates="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555441 4856 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555445 4856 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555450 4856 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555458 4856 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555463 4856 flags.go:64] FLAG: --healthz-port="10248" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555467 4856 flags.go:64] FLAG: --help="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555471 4856 flags.go:64] FLAG: --hostname-override="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555475 4856 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555479 4856 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555484 4856 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555489 4856 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555495 4856 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555500 4856 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555505 4856 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555510 4856 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555516 4856 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555521 4856 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555526 4856 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555531 4856 
flags.go:64] FLAG: --kube-reserved="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555536 4856 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555541 4856 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555547 4856 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555551 4856 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555556 4856 flags.go:64] FLAG: --lock-file="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555560 4856 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555564 4856 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555569 4856 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555579 4856 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555590 4856 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555595 4856 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555600 4856 flags.go:64] FLAG: --logging-format="text" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555606 4856 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555612 4856 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555617 4856 flags.go:64] FLAG: --manifest-url="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555621 4856 flags.go:64] FLAG: --manifest-url-header="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555627 4856 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555632 4856 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555637 4856 flags.go:64] FLAG: --max-pods="110" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555641 4856 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555646 4856 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555650 4856 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555654 4856 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555659 4856 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555663 4856 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555668 4856 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555678 4856 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555682 4856 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555686 4856 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555690 4856 flags.go:64] FLAG: --pod-cidr="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555694 4856 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555700 4856 flags.go:64] FLAG: 
--pod-manifest-path="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555704 4856 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555708 4856 flags.go:64] FLAG: --pods-per-core="0" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555712 4856 flags.go:64] FLAG: --port="10250" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555717 4856 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555720 4856 flags.go:64] FLAG: --provider-id="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555724 4856 flags.go:64] FLAG: --qos-reserved="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555729 4856 flags.go:64] FLAG: --read-only-port="10255" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555733 4856 flags.go:64] FLAG: --register-node="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555737 4856 flags.go:64] FLAG: --register-schedulable="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555741 4856 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555749 4856 flags.go:64] FLAG: --registry-burst="10" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555753 4856 flags.go:64] FLAG: --registry-qps="5" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555757 4856 flags.go:64] FLAG: --reserved-cpus="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555761 4856 flags.go:64] FLAG: --reserved-memory="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555766 4856 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555770 4856 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555776 4856 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 
13:23:05.555780 4856 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555784 4856 flags.go:64] FLAG: --runonce="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555788 4856 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555792 4856 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555796 4856 flags.go:64] FLAG: --seccomp-default="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555800 4856 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555804 4856 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555809 4856 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555813 4856 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555817 4856 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555821 4856 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555825 4856 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555829 4856 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555833 4856 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555838 4856 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555842 4856 flags.go:64] FLAG: --system-cgroups="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555846 4856 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555852 4856 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555856 4856 flags.go:64] FLAG: --tls-cert-file="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555860 4856 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555864 4856 flags.go:64] FLAG: --tls-min-version="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555868 4856 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555873 4856 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555877 4856 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555881 4856 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555885 4856 flags.go:64] FLAG: --v="2" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555890 4856 flags.go:64] FLAG: --version="false" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555895 4856 flags.go:64] FLAG: --vmodule="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555900 4856 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.555904 4856 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556000 4856 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556005 4856 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556010 4856 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:23:05 crc 
kubenswrapper[4856]: W0320 13:23:05.556015 4856 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556019 4856 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556024 4856 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556029 4856 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556033 4856 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556039 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556043 4856 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556047 4856 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556051 4856 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556056 4856 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556060 4856 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556067 4856 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556071 4856 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556076 4856 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556079 4856 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556083 4856 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556086 4856 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556090 4856 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556094 4856 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556097 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556101 4856 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556105 4856 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556109 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556113 4856 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556117 4856 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556120 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556125 4856 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556129 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556133 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556136 4856 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556140 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556144 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556147 4856 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556151 4856 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556155 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556159 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 
13:23:05.556163 4856 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556167 4856 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556172 4856 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556175 4856 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556180 4856 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556183 4856 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556186 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556190 4856 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556193 4856 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556197 4856 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556204 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556208 4856 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556211 4856 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556214 4856 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556218 4856 feature_gate.go:351] Setting deprecated feature gate 
KMSv1=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556223 4856 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556228 4856 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556232 4856 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556236 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556239 4856 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556243 4856 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556247 4856 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556250 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556254 4856 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556257 4856 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556261 4856 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556264 4856 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556286 4856 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556290 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556293 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556297 4856 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.556302 4856 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.556314 4856 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.571130 4856 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.571191 4856 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571371 4856 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571397 4856 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571407 4856 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571417 4856 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571426 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571435 4856 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571444 4856 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571455 4856 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571468 4856 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571478 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571487 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571495 4856 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571504 4856 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571513 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571522 4856 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571530 4856 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571539 4856 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571547 4856 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571555 4856 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571563 4856 feature_gate.go:330] unrecognized 
feature gate: ImageStreamImportMode Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571599 4856 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571607 4856 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571615 4856 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571624 4856 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571631 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571639 4856 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571647 4856 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571658 4856 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571670 4856 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571682 4856 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571692 4856 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571702 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571711 4856 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571719 4856 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571727 4856 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571735 4856 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571743 4856 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571751 4856 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571759 4856 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571766 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571774 4856 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571781 4856 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571789 4856 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 
13:23:05.571796 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571804 4856 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571811 4856 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571819 4856 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571829 4856 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571838 4856 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571848 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571859 4856 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571868 4856 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571877 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571885 4856 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571894 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571904 4856 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571912 4856 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571922 4856 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571930 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571938 4856 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571945 4856 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571953 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571960 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571968 4856 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571976 4856 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571984 4856 
feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571991 4856 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.571999 4856 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572006 4856 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572014 4856 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572022 4856 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.572036 4856 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572293 4856 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572308 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572320 4856 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572330 4856 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572338 4856 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572347 4856 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572355 4856 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572364 4856 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572373 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572381 4856 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572389 4856 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572396 4856 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572404 4856 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572412 4856 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572420 4856 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572427 4856 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572435 4856 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 
13:23:05.572443 4856 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572451 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572461 4856 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572471 4856 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572479 4856 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572488 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572496 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572504 4856 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572512 4856 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572520 4856 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572528 4856 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572538 4856 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572547 4856 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572555 4856 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572564 4856 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572573 4856 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572581 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572590 4856 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572597 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572605 4856 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572613 4856 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572622 4856 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572632 4856 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572640 4856 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572647 4856 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572656 4856 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572663 4856 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572672 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572681 4856 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572690 4856 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572698 4856 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572706 4856 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572715 4856 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572722 4856 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572730 4856 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572738 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572745 4856 
feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572753 4856 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572761 4856 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572769 4856 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572777 4856 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572784 4856 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572792 4856 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572800 4856 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572807 4856 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572815 4856 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572823 4856 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572831 4856 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572839 4856 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572846 4856 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572854 4856 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572861 4856 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572869 4856 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.572876 4856 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.572908 4856 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.573190 4856 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.577959 4856 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.582183 4856 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.582340 4856 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.584256 4856 server.go:997] "Starting client certificate rotation"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.584310 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.584449 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.610596 4856 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.613026 4856 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.614413 4856 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.632306 4856 log.go:25] "Validated CRI v1 runtime API"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.678528 4856 log.go:25] "Validated CRI v1 image API"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.681198 4856 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.686992 4856 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-18-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.687040 4856 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:46 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.717889 4856 manager.go:217] Machine: {Timestamp:2026-03-20 13:23:05.713612768 +0000 UTC m=+0.594638968 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:2699e498-35c4-4151-a88c-336d77e1ef57 BootID:1419ae1d-c71a-4a60-94aa-534c0802748c Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:46 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:94:81:4e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:94:81:4e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:0b:ab:79 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3e:4d:63 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ab:3f:f1 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:3e:02:ef Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:a3:cd:59 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:1a:6f:84:0b:04:cb Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:da:ab:5d:85:f9:6a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.718296 4856 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.718515 4856 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.720384 4856 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.720727 4856 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.720781 4856 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.721348 4856 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.721370 4856 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.721951 4856 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.722025 4856 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.722299 4856 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.722457 4856 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.726070 4856 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.726105 4856 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.726144 4856 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.726167 4856 kubelet.go:324] "Adding apiserver pod source"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.726184 4856 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.732002 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.732021 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.732103 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.732129 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.734174 4856 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.735161 4856 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.736755 4856 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738541 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738586 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738605 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738619 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738641 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738655 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738669 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738693 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738708 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738721 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738792 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.738808 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.739852 4856 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.740570 4856 server.go:1280] "Started kubelet"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.741548 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.741763 4856 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.741782 4856 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.742645 4856 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 13:23:05 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.745592 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.745641 4856 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.747563 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.747673 4856 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.747767 4856 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.747815 4856 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.748220 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.748315 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.748258 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.748521 4856 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.747771 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8f6951fe84b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,LastTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.753003 4856 factory.go:55] Registering systemd factory
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.753058 4856 factory.go:221] Registration of the systemd container factory successfully
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.754023 4856 factory.go:153] Registering CRI-O factory
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.754066 4856 factory.go:221] Registration of the crio container factory successfully
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.754384 4856 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.754442 4856 factory.go:103] Registering Raw factory
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.754495 4856 manager.go:1196] Started watching for new ooms in manager
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.755537 4856 manager.go:319] Starting recovery of all containers
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762061 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762119 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762139 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762186 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762202 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762214 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762225 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762238 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762251 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762262 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762301 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762319 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762334 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762350 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762366 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762383 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762412 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762429 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762444 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762459 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762473 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762488 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762507 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762522 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762612 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762628 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762649 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762668 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762686 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762701 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762715 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762730 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762746 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762762 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762777 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762793 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762808 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762821 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762834 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762846 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762866 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762878 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762890 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762904 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762917 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762929 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 13:23:05 crc kubenswrapper[4856]:
I0320 13:23:05.762943 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762958 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762970 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762982 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.762995 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763007 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763025 4856 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763039 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763052 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763066 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763080 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763093 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763105 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763118 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763150 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763162 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763174 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763187 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763198 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763210 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763223 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763234 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.763247 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765144 4856 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765170 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765184 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765196 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765207 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765219 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765230 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765241 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765254 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765288 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765303 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765316 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765328 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765340 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765352 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765364 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765376 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765388 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765399 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765411 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765423 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765435 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765447 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765458 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765469 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765480 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765494 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765506 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765517 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765528 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765541 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765551 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765562 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765573 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765584 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765595 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765611 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765623 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765637 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765649 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765663 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765675 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765688 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765701 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765712 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765725 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765737 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765749 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765760 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765773 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765784 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765795 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765808 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765819 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765831 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765842 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 13:23:05 crc 
kubenswrapper[4856]: I0320 13:23:05.765853 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765864 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765878 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765892 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765907 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765923 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765940 4856 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765957 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765969 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765982 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.765993 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766005 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766015 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766026 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766037 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766048 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766059 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766069 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766081 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766092 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766103 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766114 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766124 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766135 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766146 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766166 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766176 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766187 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766198 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766208 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766218 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766229 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766241 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766252 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766286 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766299 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766309 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766321 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766332 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766345 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766357 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766369 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766381 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766392 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766422 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766439 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766454 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766469 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766484 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766496 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766509 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766521 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766532 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766544 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766555 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766601 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766613 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766628 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766642 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766654 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766665 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" 
seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766677 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766688 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766700 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766712 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766723 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766736 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766747 
4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766759 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766772 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766783 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766794 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766805 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766817 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766829 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766840 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766852 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766864 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766876 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766888 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766900 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766912 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766924 4856 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766935 4856 reconstruct.go:97] "Volume reconstruction finished" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.766943 4856 reconciler.go:26] "Reconciler: start to sync state" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.794630 4856 manager.go:324] Recovery completed Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.815169 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.815346 4856 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.817193 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.817523 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.817545 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.817972 4856 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.818485 4856 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.818537 4856 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.818605 4856 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.818742 4856 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.818772 4856 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.818799 4856 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:23:05 crc kubenswrapper[4856]: W0320 13:23:05.820825 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.820951 4856 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.840901 4856 policy_none.go:49] "None policy: Start" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.842249 4856 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.842316 4856 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.847789 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.905934 4856 manager.go:334] "Starting Device Plugin manager" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906013 4856 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906029 4856 server.go:79] "Starting device plugin registration server" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906485 4856 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906521 4856 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906725 4856 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906818 4856 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.906827 4856 plugin_manager.go:118] "Starting Kubelet 
Plugin Manager" Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.915633 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.919199 4856 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.919392 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.920608 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.920753 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.920873 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.921132 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.921393 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.921438 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922307 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922336 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922347 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922416 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922437 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922571 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922710 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.922740 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923420 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923442 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923452 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923551 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923798 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923892 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.923953 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924059 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924143 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924201 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924216 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924355 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924555 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924630 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924776 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924805 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924815 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.924990 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925010 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925019 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925188 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925216 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925675 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925709 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925722 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925905 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925933 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.925944 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4856]: E0320 13:23:05.949075 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970443 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970542 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970591 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970645 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970694 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970742 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc 
kubenswrapper[4856]: I0320 13:23:05.970784 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970842 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970898 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970927 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.970963 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.971011 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.971062 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.971116 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:05 crc kubenswrapper[4856]: I0320 13:23:05.971164 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.006753 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.008491 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.008531 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 
crc kubenswrapper[4856]: I0320 13:23:06.008543 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.008572 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.009126 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073027 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073075 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073149 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073171 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 
13:23:06.073191 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073230 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073242 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073345 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073352 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073366 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073432 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073466 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073497 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073526 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073555 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 
13:23:06.073579 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073557 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073618 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073632 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073576 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073588 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073629 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073703 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073738 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073745 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073768 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073799 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073826 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073867 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.073983 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.074053 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.122100 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-1b1a08b7b304cc2f8d9826c3f7f3d14cf13de513af33076cf14c8992ef47cc1d WatchSource:0}: Error finding container 1b1a08b7b304cc2f8d9826c3f7f3d14cf13de513af33076cf14c8992ef47cc1d: Status 404 returned error can't find the container with id 1b1a08b7b304cc2f8d9826c3f7f3d14cf13de513af33076cf14c8992ef47cc1d Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.209804 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.211420 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.211461 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.211474 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.211497 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.211951 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: 
connection refused" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.275009 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.287465 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-6892c529dbd0f8c4b23680fbd8988ebcaa0ee739a99beaf3e2c4115386180462 WatchSource:0}: Error finding container 6892c529dbd0f8c4b23680fbd8988ebcaa0ee739a99beaf3e2c4115386180462: Status 404 returned error can't find the container with id 6892c529dbd0f8c4b23680fbd8988ebcaa0ee739a99beaf3e2c4115386180462 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.289874 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.312929 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f8933bf83eb2f276818390ade54ffd6b9c797090567c6303dedf30a032a2d055 WatchSource:0}: Error finding container f8933bf83eb2f276818390ade54ffd6b9c797090567c6303dedf30a032a2d055: Status 404 returned error can't find the container with id f8933bf83eb2f276818390ade54ffd6b9c797090567c6303dedf30a032a2d055 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.335645 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.350704 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.353863 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-bb597c4bea46e1833dd38d6fa5e43891479f17cd34a6b65078567e1c8fd72043 WatchSource:0}: Error finding container bb597c4bea46e1833dd38d6fa5e43891479f17cd34a6b65078567e1c8fd72043: Status 404 returned error can't find the container with id bb597c4bea46e1833dd38d6fa5e43891479f17cd34a6b65078567e1c8fd72043 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.363440 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.380779 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-150b0e75003610a778bf6c0221848ac4a68ec9036f536b06b18e0db204b5c835 WatchSource:0}: Error finding container 150b0e75003610a778bf6c0221848ac4a68ec9036f536b06b18e0db204b5c835: Status 404 returned error can't find the container with id 150b0e75003610a778bf6c0221848ac4a68ec9036f536b06b18e0db204b5c835 Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.557446 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.557557 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.612989 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.614378 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.614602 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.614623 4856 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.614667 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.615312 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 20 13:23:06 crc kubenswrapper[4856]: W0320 13:23:06.718620 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:06 crc kubenswrapper[4856]: E0320 13:23:06.718766 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.743254 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.825741 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.825876 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"150b0e75003610a778bf6c0221848ac4a68ec9036f536b06b18e0db204b5c835"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.828073 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379" exitCode=0 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.828145 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.828173 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bb597c4bea46e1833dd38d6fa5e43891479f17cd34a6b65078567e1c8fd72043"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.828252 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.829435 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.829472 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.829482 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.830516 4856 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda" exitCode=0 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.830570 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.830595 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f8933bf83eb2f276818390ade54ffd6b9c797090567c6303dedf30a032a2d055"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.830670 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.830722 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831496 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831514 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831524 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831569 4856 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801" exitCode=0 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831612 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831629 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6892c529dbd0f8c4b23680fbd8988ebcaa0ee739a99beaf3e2c4115386180462"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831680 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831866 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831877 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.831886 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.833378 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.833406 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.833418 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.834319 4856 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e" exitCode=0 Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.834344 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.834367 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1b1a08b7b304cc2f8d9826c3f7f3d14cf13de513af33076cf14c8992ef47cc1d"} Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.834426 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.835141 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.835154 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4856]: I0320 13:23:06.835161 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: E0320 13:23:07.152222 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 20 13:23:07 crc kubenswrapper[4856]: W0320 13:23:07.196782 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:07 crc kubenswrapper[4856]: E0320 13:23:07.196855 4856 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:07 crc kubenswrapper[4856]: W0320 13:23:07.302511 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:07 crc kubenswrapper[4856]: E0320 13:23:07.302603 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.420836 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.423334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.423399 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.423409 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.423432 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:07 crc kubenswrapper[4856]: E0320 13:23:07.423940 4856 kubelet_node_status.go:99] "Unable to register node 
with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.192:6443: connect: connection refused" node="crc" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.711574 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:23:07 crc kubenswrapper[4856]: E0320 13:23:07.712262 4856 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.192:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.742282 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.840828 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.840883 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.840895 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.840893 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.841623 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.841665 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.841676 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.844092 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.844126 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.844139 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.844150 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.847310 4856 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9" exitCode=0 Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.847365 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.847523 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.849613 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.849654 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.849665 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.850562 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.850681 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.853659 4856 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.853685 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.853695 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.856633 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.856665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.856676 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd"} Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.856755 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.857511 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.857547 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.857560 4856 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4856]: I0320 13:23:07.986398 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.861550 4856 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc" exitCode=0 Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.861642 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc"} Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.862086 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.864552 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.864586 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.864600 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.869101 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"db54a51d5a70e941c6d7f34e9fbc67e7b1d3beb4b0eedfd7baf828eb64e8f651"} Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.869142 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 
13:23:08.869217 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.869952 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.869988 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.870003 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.873779 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.873865 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:08 crc kubenswrapper[4856]: I0320 13:23:08.873891 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.024094 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.025807 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.025842 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.025852 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.025875 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:09 crc 
kubenswrapper[4856]: I0320 13:23:09.415767 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.424447 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.875901 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f"} Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.875957 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.875979 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975"} Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.876009 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.876010 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2"} Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.876037 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b"} Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.876045 4856 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877518 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877557 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877622 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877628 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877651 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:09 crc kubenswrapper[4856]: I0320 13:23:09.877665 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.883951 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d"} Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.884106 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.884178 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.885402 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.885445 4856 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.885463 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.886346 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.886405 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4856]: I0320 13:23:10.886428 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.295395 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.370126 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.458883 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.459082 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.459134 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.460698 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 13:23:11.460762 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4856]: I0320 
13:23:11.460789 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.083767 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.085114 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.088840 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.088880 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.088924 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.088883 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.093997 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.094115 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.094160 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.097834 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.098064 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc 
kubenswrapper[4856]: I0320 13:23:12.098216 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.098345 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.098372 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.098433 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4856]: I0320 13:23:12.918741 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.091575 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.093200 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.093304 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.093330 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.893060 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.893474 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.895356 4856 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.895510 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4856]: I0320 13:23:13.895531 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4856]: I0320 13:23:14.094232 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:14 crc kubenswrapper[4856]: I0320 13:23:14.095885 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4856]: I0320 13:23:14.095955 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4856]: I0320 13:23:14.095978 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4856]: E0320 13:23:15.915892 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:23:16 crc kubenswrapper[4856]: I0320 13:23:16.514835 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:16 crc kubenswrapper[4856]: I0320 13:23:16.515125 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:16 crc kubenswrapper[4856]: I0320 13:23:16.516956 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4856]: I0320 13:23:16.517025 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc 
kubenswrapper[4856]: I0320 13:23:16.517043 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4856]: I0320 13:23:16.519442 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.101783 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.103259 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.103364 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.103389 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.808490 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.808792 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.810482 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.810549 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4856]: I0320 13:23:17.810569 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4856]: W0320 13:23:18.711017 4856 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:23:18 crc kubenswrapper[4856]: I0320 13:23:18.711157 4856 trace.go:236] Trace[186133760]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:23:08.709) (total time: 10002ms): Mar 20 13:23:18 crc kubenswrapper[4856]: Trace[186133760]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:23:18.711) Mar 20 13:23:18 crc kubenswrapper[4856]: Trace[186133760]: [10.002084408s] [10.002084408s] END Mar 20 13:23:18 crc kubenswrapper[4856]: E0320 13:23:18.711177 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:23:18 crc kubenswrapper[4856]: I0320 13:23:18.744342 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:23:18 crc kubenswrapper[4856]: E0320 13:23:18.753837 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 20 13:23:18 crc kubenswrapper[4856]: W0320 13:23:18.914174 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:23:18 crc kubenswrapper[4856]: I0320 13:23:18.914284 4856 trace.go:236] Trace[152809936]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:23:08.912) (total time: 10001ms): Mar 20 13:23:18 crc kubenswrapper[4856]: Trace[152809936]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:23:18.914) Mar 20 13:23:18 crc kubenswrapper[4856]: Trace[152809936]: [10.001251866s] [10.001251866s] END Mar 20 13:23:18 crc kubenswrapper[4856]: E0320 13:23:18.914310 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:23:19 crc kubenswrapper[4856]: W0320 13:23:19.021026 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.021107 4856 trace.go:236] Trace[1393514191]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:23:09.019) (total time: 10001ms): Mar 20 13:23:19 crc kubenswrapper[4856]: Trace[1393514191]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:23:19.021) Mar 20 13:23:19 crc kubenswrapper[4856]: Trace[1393514191]: [10.001767039s] [10.001767039s] END Mar 20 
13:23:19 crc kubenswrapper[4856]: E0320 13:23:19.021126 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:23:19 crc kubenswrapper[4856]: E0320 13:23:19.026461 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.108028 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.110258 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db54a51d5a70e941c6d7f34e9fbc67e7b1d3beb4b0eedfd7baf828eb64e8f651" exitCode=255 Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.110327 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"db54a51d5a70e941c6d7f34e9fbc67e7b1d3beb4b0eedfd7baf828eb64e8f651"} Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.110513 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.111410 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.111476 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 
13:23:19.111496 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.112355 4856 scope.go:117] "RemoveContainer" containerID="db54a51d5a70e941c6d7f34e9fbc67e7b1d3beb4b0eedfd7baf828eb64e8f651" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.515638 4856 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.515749 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:23:19 crc kubenswrapper[4856]: W0320 13:23:19.678931 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.679051 4856 trace.go:236] Trace[1929159850]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:23:09.677) (total time: 10001ms): Mar 20 13:23:19 crc kubenswrapper[4856]: Trace[1929159850]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:23:19.678) Mar 20 13:23:19 crc kubenswrapper[4856]: Trace[1929159850]: 
[10.001857702s] [10.001857702s] END Mar 20 13:23:19 crc kubenswrapper[4856]: E0320 13:23:19.679084 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 13:23:19 crc kubenswrapper[4856]: E0320 13:23:19.889028 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:19Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f6951fe84b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,LastTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:19 crc kubenswrapper[4856]: E0320 13:23:19.893942 4856 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.895927 4856 
patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.895994 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.900874 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:19Z is after 2026-02-23T05:33:13Z Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.905080 4856 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:23:19 crc kubenswrapper[4856]: I0320 13:23:19.905161 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.115499 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.117566 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3"} Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.117757 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.118835 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.118872 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.118887 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:20 crc kubenswrapper[4856]: I0320 13:23:20.746010 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2026-02-23T05:33:13Z Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.122431 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.123103 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.125202 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" exitCode=255 Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.125295 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3"} Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.125365 4856 scope.go:117] "RemoveContainer" containerID="db54a51d5a70e941c6d7f34e9fbc67e7b1d3beb4b0eedfd7baf828eb64e8f651" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.125620 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.126839 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.126879 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.126895 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.127628 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" Mar 20 13:23:21 crc kubenswrapper[4856]: E0320 13:23:21.127913 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.468415 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.747341 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2026-02-23T05:33:13Z Mar 20 13:23:21 crc kubenswrapper[4856]: I0320 13:23:21.943146 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:21 crc kubenswrapper[4856]: E0320 13:23:21.959630 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.132018 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.135533 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.136883 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 
13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.136941 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.136960 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.137860 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" Mar 20 13:23:22 crc kubenswrapper[4856]: E0320 13:23:22.138136 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.226862 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.228358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.228422 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.228450 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.228489 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:22 crc kubenswrapper[4856]: E0320 13:23:22.233326 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:22Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.748236 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:22Z is after 2026-02-23T05:33:13Z Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.853380 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.955751 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.955992 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.960869 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.960933 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.960958 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:22 crc kubenswrapper[4856]: I0320 13:23:22.985215 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.138804 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.138863 4856 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.140182 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.140233 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.140250 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.141983 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.142048 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.142065 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.142302 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" Mar 20 13:23:23 crc kubenswrapper[4856]: E0320 13:23:23.142588 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:23 crc kubenswrapper[4856]: W0320 13:23:23.377975 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:23Z is after 2026-02-23T05:33:13Z Mar 20 13:23:23 crc kubenswrapper[4856]: E0320 13:23:23.378061 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.747710 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:23Z is after 2026-02-23T05:33:13Z Mar 20 13:23:23 crc kubenswrapper[4856]: I0320 13:23:23.893701 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:23 crc kubenswrapper[4856]: W0320 13:23:23.967068 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:23Z is after 2026-02-23T05:33:13Z Mar 20 13:23:23 crc kubenswrapper[4856]: E0320 13:23:23.967186 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.141187 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.142479 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.142538 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.142556 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.143470 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" Mar 20 13:23:24 crc kubenswrapper[4856]: E0320 13:23:24.143774 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:24 crc kubenswrapper[4856]: W0320 13:23:24.248668 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:23:24Z is after 2026-02-23T05:33:13Z Mar 20 13:23:24 crc kubenswrapper[4856]: E0320 13:23:24.248770 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:23:24 crc kubenswrapper[4856]: W0320 13:23:24.410143 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:24Z is after 2026-02-23T05:33:13Z Mar 20 13:23:24 crc kubenswrapper[4856]: E0320 13:23:24.410252 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:23:24 crc kubenswrapper[4856]: I0320 13:23:24.747595 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:24Z is after 2026-02-23T05:33:13Z Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.144021 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.146183 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.146478 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.146662 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.147626 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3" Mar 20 13:23:25 crc kubenswrapper[4856]: E0320 13:23:25.148060 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:25 crc kubenswrapper[4856]: I0320 13:23:25.745476 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2026-02-23T05:33:13Z Mar 20 13:23:25 crc kubenswrapper[4856]: E0320 13:23:25.916029 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:23:26 crc kubenswrapper[4856]: I0320 13:23:26.747501 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:26Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:27 crc kubenswrapper[4856]: I0320 13:23:27.747938 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:27Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.334371 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:23:28 crc kubenswrapper[4856]: E0320 13:23:28.341042 4856 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:23:28 crc kubenswrapper[4856]: E0320 13:23:28.365673 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:28Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.634103 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.635554 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.635610 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.635628 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.635658 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:23:28 crc kubenswrapper[4856]: E0320 13:23:28.640340 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:28Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:23:28 crc kubenswrapper[4856]: I0320 13:23:28.747055 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:28Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:29 crc kubenswrapper[4856]: I0320 13:23:29.515862 4856 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:23:29 crc kubenswrapper[4856]: I0320 13:23:29.515951 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:23:29 crc kubenswrapper[4856]: I0320 13:23:29.748782 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:29Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:29 crc kubenswrapper[4856]: E0320 13:23:29.892079 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f6951fe84b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,LastTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:30 crc kubenswrapper[4856]: I0320 13:23:30.746409 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:30 crc kubenswrapper[4856]: W0320 13:23:30.988635 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:30 crc kubenswrapper[4856]: E0320 13:23:30.988799 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:23:31 crc kubenswrapper[4856]: I0320 13:23:31.748741 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:32 crc kubenswrapper[4856]: I0320 13:23:32.747531 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:33 crc kubenswrapper[4856]: W0320 13:23:33.113260 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:33 crc kubenswrapper[4856]: E0320 13:23:33.113409 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:23:33 crc kubenswrapper[4856]: W0320 13:23:33.209613 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:33 crc kubenswrapper[4856]: E0320 13:23:33.209755 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:23:33 crc kubenswrapper[4856]: W0320 13:23:33.410544 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:33 crc kubenswrapper[4856]: E0320 13:23:33.410646 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:23:33 crc kubenswrapper[4856]: I0320 13:23:33.747016 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:33Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:34 crc kubenswrapper[4856]: I0320 13:23:34.747022 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:34Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:35 crc kubenswrapper[4856]: E0320 13:23:35.371047 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:35Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.641379 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.643241 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.643336 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.643356 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.643395 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:23:35 crc kubenswrapper[4856]: E0320 13:23:35.648426 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:35Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:23:35 crc kubenswrapper[4856]: I0320 13:23:35.748081 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:35Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:35 crc kubenswrapper[4856]: E0320 13:23:35.916170 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.747308 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:36Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.819968 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.821619 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.821702 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.821724 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:36 crc kubenswrapper[4856]: I0320 13:23:36.822701 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3"
Mar 20 13:23:37 crc kubenswrapper[4856]: I0320 13:23:37.747791 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.035143 4856 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39886->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.035243 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39886->192.168.126.11:10357: read: connection reset by peer"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.035383 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.035648 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.037354 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.037391 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.037401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.037859 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.038027 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339" gracePeriod=30
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.188110 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.188512 4856 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339" exitCode=255
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.188574 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339"}
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.189922 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.192987 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"}
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.193125 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.194687 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.194727 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.194740 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:38 crc kubenswrapper[4856]: I0320 13:23:38.747659 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:38Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.198679 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.199334 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.202184 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae" exitCode=255
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.202300 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"}
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.202370 4856 scope.go:117] "RemoveContainer" containerID="4394fccbb80c2a0ccd6b7611790062861293d1db9d9c580368de0c526b5bb5d3"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.202608 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.203874 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.203923 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.203935 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.204644 4856 scope.go:117] "RemoveContainer" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"
Mar 20 13:23:39 crc kubenswrapper[4856]: E0320 13:23:39.204873 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.207629 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.208233 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e"}
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.208428 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.209817 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.209862 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.209885 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:39 crc kubenswrapper[4856]: I0320 13:23:39.747357 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:39 crc kubenswrapper[4856]: E0320 13:23:39.898713 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f6951fe84b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,LastTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.213604 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.216886 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.218076 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.218132 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.218151 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:40 crc kubenswrapper[4856]: I0320 13:23:40.748605 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2026-02-23T05:33:13Z
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.295703 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.295885 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.297175 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.297202 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.297211 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:41 crc kubenswrapper[4856]: I0320 13:23:41.749731 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:42 crc kubenswrapper[4856]: E0320 13:23:42.380348 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.648619 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.650973 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.651018 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.651030 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.651061 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:23:42 crc kubenswrapper[4856]: E0320 13:23:42.657539 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.747131 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.852722 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.852843 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.854029 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.854074 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.854084 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:42 crc kubenswrapper[4856]: I0320 13:23:42.854465 4856 scope.go:117] "RemoveContainer" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"
Mar 20 13:23:42 crc kubenswrapper[4856]: E0320 13:23:42.854776 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.748111 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.893568 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.894346 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.896669 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.896861 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.896985 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:43 crc kubenswrapper[4856]: I0320 13:23:43.897967 4856 scope.go:117] "RemoveContainer" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"
Mar 20 13:23:43 crc kubenswrapper[4856]: E0320 13:23:43.898465 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:23:44 crc kubenswrapper[4856]: I0320 13:23:44.747543 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:45 crc kubenswrapper[4856]: I0320 13:23:45.750748 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:45 crc kubenswrapper[4856]: I0320 13:23:45.882014 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:23:45 crc kubenswrapper[4856]: I0320 13:23:45.900070 4856 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 13:23:45 crc kubenswrapper[4856]: E0320 13:23:45.916539 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.515392 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.515631 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.517125 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.517169 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.517187 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:46 crc kubenswrapper[4856]: I0320 13:23:46.750094 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:47 crc kubenswrapper[4856]: I0320 13:23:47.749236 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:48 crc kubenswrapper[4856]: I0320 13:23:48.749877 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.387587 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.516254 4856 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.516382 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.657651 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.659474 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.659696 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.659883 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.660089 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.663924 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 13:23:49 crc kubenswrapper[4856]: I0320 13:23:49.743555 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.906693 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f6951fe84b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,LastTimestamp:2026-03-20 13:23:05.740526774 +0000 UTC m=+0.621552934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.914094 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.921830 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.929157 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC
m=+0.698580888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.936157 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695c211304 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.910563588 +0000 UTC m=+0.791589718,LastTimestamp:2026-03-20 13:23:05.910563588 +0000 UTC m=+0.791589718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.944596 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.920728398 +0000 UTC m=+0.801754538,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc 
kubenswrapper[4856]: E0320 13:23:49.951503 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.920852523 +0000 UTC m=+0.801878663,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.958628 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.920956237 +0000 UTC m=+0.801982377,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.965711 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.92232423 +0000 UTC m=+0.803350370,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.972312 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.92234307 +0000 UTC m=+0.803369210,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.979444 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.922353451 +0000 UTC m=+0.803379591,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.986370 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.922430344 +0000 UTC m=+0.803456484,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.990224 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.922443824 +0000 UTC m=+0.803469964,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.993504 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.922454585 +0000 UTC m=+0.803480725,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:49 crc kubenswrapper[4856]: E0320 13:23:49.996663 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.923436103 +0000 UTC m=+0.804462243,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.000328 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.923448123 +0000 UTC m=+0.804474263,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.006380 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC 
m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.923457954 +0000 UTC m=+0.804484094,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.013302 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.923906541 +0000 UTC m=+0.804932691,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.020434 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.92415171 +0000 UTC m=+0.805177860,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.027084 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.924175851 +0000 UTC m=+0.805201991,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.034512 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.924211012 +0000 UTC m=+0.805237152,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.041558 4856 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.924222673 +0000 UTC m=+0.805248813,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.048807 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f695695df1e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f695695df1e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817554718 +0000 UTC m=+0.698580888,LastTimestamp:2026-03-20 13:23:05.924303437 +0000 UTC m=+0.805329567,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.055823 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569509b6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569509b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817500086 +0000 UTC m=+0.698526246,LastTimestamp:2026-03-20 13:23:05.924792215 +0000 UTC m=+0.805818355,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.063257 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f69569598cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f69569598cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:05.817536717 +0000 UTC m=+0.698562877,LastTimestamp:2026-03-20 13:23:05.924811836 +0000 UTC m=+0.805837976,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.071567 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f696911c0da openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.127663322 +0000 UTC m=+1.008689492,LastTimestamp:2026-03-20 13:23:06.127663322 +0000 UTC m=+1.008689492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.078812 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f6972b0988d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.289068173 +0000 UTC m=+1.170094313,LastTimestamp:2026-03-20 13:23:06.289068173 +0000 UTC m=+1.170094313,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: 
E0320 13:23:50.085216 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6974402d29 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.315255081 +0000 UTC m=+1.196281231,LastTimestamp:2026-03-20 13:23:06.315255081 +0000 UTC m=+1.196281231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.092739 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f6976d11847 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.358306887 +0000 UTC 
m=+1.239333027,LastTimestamp:2026-03-20 13:23:06.358306887 +0000 UTC m=+1.239333027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.097335 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69789544ce openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.387940558 +0000 UTC m=+1.268966698,LastTimestamp:2026-03-20 13:23:06.387940558 +0000 UTC m=+1.268966698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.099153 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f698db5540b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.742363147 +0000 UTC m=+1.623389287,LastTimestamp:2026-03-20 13:23:06.742363147 +0000 UTC m=+1.623389287,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.105399 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f698dd65167 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.744525159 +0000 UTC m=+1.625551309,LastTimestamp:2026-03-20 13:23:06.744525159 +0000 UTC m=+1.625551309,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.110872 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f698dd75bc8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.744593352 +0000 UTC m=+1.625619492,LastTimestamp:2026-03-20 13:23:06.744593352 +0000 UTC m=+1.625619492,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.116058 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f698dd9c3b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.744751028 +0000 UTC m=+1.625777178,LastTimestamp:2026-03-20 13:23:06.744751028 +0000 UTC m=+1.625777178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.122476 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f698de5846a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.745521258 +0000 UTC m=+1.626547408,LastTimestamp:2026-03-20 13:23:06.745521258 +0000 UTC m=+1.626547408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.129109 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f698eb1f9ba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.758920634 +0000 UTC m=+1.639946774,LastTimestamp:2026-03-20 13:23:06.758920634 +0000 UTC m=+1.639946774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.134906 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f698ec1799c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.759936412 +0000 UTC m=+1.640962542,LastTimestamp:2026-03-20 13:23:06.759936412 +0000 UTC m=+1.640962542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.141669 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f698ece4b25 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.760776485 +0000 UTC m=+1.641802615,LastTimestamp:2026-03-20 13:23:06.760776485 +0000 UTC m=+1.641802615,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.148787 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f698ed6103e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.761285694 +0000 UTC m=+1.642311864,LastTimestamp:2026-03-20 13:23:06.761285694 +0000 UTC m=+1.642311864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.157262 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f698f106ee4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.765111012 +0000 UTC m=+1.646137152,LastTimestamp:2026-03-20 13:23:06.765111012 +0000 UTC m=+1.646137152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.162813 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f698f2bf5d7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.766915031 +0000 UTC m=+1.647941171,LastTimestamp:2026-03-20 13:23:06.766915031 +0000 UTC m=+1.647941171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.168468 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f6992f67c85 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.830519429 +0000 UTC m=+1.711545559,LastTimestamp:2026-03-20 
13:23:06.830519429 +0000 UTC m=+1.711545559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.174893 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6993294393 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.833847187 +0000 UTC m=+1.714873317,LastTimestamp:2026-03-20 13:23:06.833847187 +0000 UTC m=+1.714873317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.182606 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f6993433f28 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container 
image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.835549992 +0000 UTC m=+1.716576122,LastTimestamp:2026-03-20 13:23:06.835549992 +0000 UTC m=+1.716576122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.187943 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f6993726fbe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.838642622 +0000 UTC m=+1.719668752,LastTimestamp:2026-03-20 13:23:06.838642622 +0000 UTC m=+1.719668752,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.193146 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69a2c4288f 
openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.095656591 +0000 UTC m=+1.976682721,LastTimestamp:2026-03-20 13:23:07.095656591 +0000 UTC m=+1.976682721,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.199749 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69a2e0e0f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.097538804 +0000 UTC m=+1.978564924,LastTimestamp:2026-03-20 13:23:07.097538804 +0000 UTC m=+1.978564924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.207129 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8f69a2e42baf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.097754543 +0000 UTC m=+1.978780673,LastTimestamp:2026-03-20 13:23:07.097754543 +0000 UTC m=+1.978780673,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.214239 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f69a2e82fed openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.098017773 +0000 UTC m=+1.979043903,LastTimestamp:2026-03-20 13:23:07.098017773 +0000 UTC m=+1.979043903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.219877 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69a2ea8fff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.098173439 +0000 UTC m=+1.979199569,LastTimestamp:2026-03-20 13:23:07.098173439 +0000 UTC m=+1.979199569,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.225220 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69a42510c8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.118784712 +0000 UTC m=+1.999810852,LastTimestamp:2026-03-20 13:23:07.118784712 +0000 UTC m=+1.999810852,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.231961 4856 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69a43ef321 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.120481057 +0000 UTC m=+2.001507187,LastTimestamp:2026-03-20 13:23:07.120481057 +0000 UTC m=+2.001507187,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.240570 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69a44a5ed5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.121229525 +0000 UTC m=+2.002255655,LastTimestamp:2026-03-20 13:23:07.121229525 +0000 UTC m=+2.002255655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.247656 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69a44ceef4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.121397492 +0000 UTC m=+2.002423612,LastTimestamp:2026-03-20 13:23:07.121397492 +0000 UTC m=+2.002423612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.253594 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f69a44e4bcc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:23:07.121486796 +0000 UTC m=+2.002512926,LastTimestamp:2026-03-20 13:23:07.121486796 +0000 UTC m=+2.002512926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.257796 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f69a46f5cc5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.123653829 +0000 UTC m=+2.004679959,LastTimestamp:2026-03-20 13:23:07.123653829 +0000 UTC m=+2.004679959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.263843 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69a4bdf4e1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.128804577 +0000 UTC m=+2.009830697,LastTimestamp:2026-03-20 13:23:07.128804577 +0000 UTC m=+2.009830697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.269756 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69a4c43af5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.129215733 +0000 UTC m=+2.010241863,LastTimestamp:2026-03-20 13:23:07.129215733 +0000 UTC m=+2.010241863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.275428 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69b126c434 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.336999988 +0000 UTC m=+2.218026138,LastTimestamp:2026-03-20 13:23:07.336999988 +0000 UTC m=+2.218026138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.282000 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69b12a72fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.337241339 +0000 UTC m=+2.218267479,LastTimestamp:2026-03-20 13:23:07.337241339 +0000 UTC m=+2.218267479,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.288371 4856 event.go:359] "Server rejected event 
(will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69b12e8369 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.337507689 +0000 UTC m=+2.218533819,LastTimestamp:2026-03-20 13:23:07.337507689 +0000 UTC m=+2.218533819,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.294567 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69b1f27204 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.350348292 +0000 UTC m=+2.231374442,LastTimestamp:2026-03-20 13:23:07.350348292 +0000 UTC m=+2.231374442,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.302025 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69b1fe5be7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.351129063 +0000 UTC m=+2.232155193,LastTimestamp:2026-03-20 13:23:07.351129063 +0000 UTC m=+2.232155193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.308540 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69b224fc90 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started 
container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.35366056 +0000 UTC m=+2.234686700,LastTimestamp:2026-03-20 13:23:07.35366056 +0000 UTC m=+2.234686700,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.314963 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69b2341ac0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.354651328 +0000 UTC m=+2.235677458,LastTimestamp:2026-03-20 13:23:07.354651328 +0000 UTC m=+2.235677458,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.321771 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69b2434959 openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.355646297 +0000 UTC m=+2.236672447,LastTimestamp:2026-03-20 13:23:07.355646297 +0000 UTC m=+2.236672447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.327752 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69b26b0fcf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.358253007 +0000 UTC m=+2.239279137,LastTimestamp:2026-03-20 13:23:07.358253007 +0000 UTC m=+2.239279137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.333803 4856 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69be992ec5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.562602181 +0000 UTC m=+2.443628321,LastTimestamp:2026-03-20 13:23:07.562602181 +0000 UTC m=+2.443628321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.340043 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69bebb04aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.564819626 +0000 UTC m=+2.445845766,LastTimestamp:2026-03-20 13:23:07.564819626 +0000 UTC m=+2.445845766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.346860 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69becb0f1c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.565870876 +0000 UTC m=+2.446897016,LastTimestamp:2026-03-20 13:23:07.565870876 +0000 UTC m=+2.446897016,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.353489 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69bfd0c222 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.583021602 
+0000 UTC m=+2.464047742,LastTimestamp:2026-03-20 13:23:07.583021602 +0000 UTC m=+2.464047742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.359499 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69bfe29783 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.584190339 +0000 UTC m=+2.465216469,LastTimestamp:2026-03-20 13:23:07.584190339 +0000 UTC m=+2.465216469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.366169 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f69c005c8b3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.586496691 +0000 UTC m=+2.467522831,LastTimestamp:2026-03-20 13:23:07.586496691 +0000 UTC m=+2.467522831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.372813 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69c01490c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.587465412 +0000 UTC m=+2.468491532,LastTimestamp:2026-03-20 13:23:07.587465412 +0000 UTC m=+2.468491532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.379468 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69ca3f8767 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.758053223 +0000 UTC m=+2.639079353,LastTimestamp:2026-03-20 13:23:07.758053223 +0000 UTC m=+2.639079353,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.385231 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69cb331957 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.774015831 +0000 UTC m=+2.655041951,LastTimestamp:2026-03-20 13:23:07.774015831 +0000 UTC m=+2.655041951,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.391099 4856 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69cb420727 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.774994215 +0000 UTC m=+2.656020345,LastTimestamp:2026-03-20 13:23:07.774994215 +0000 UTC m=+2.656020345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.401480 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f69cfc8905f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.850920031 +0000 UTC m=+2.731946161,LastTimestamp:2026-03-20 13:23:07.850920031 +0000 UTC 
m=+2.731946161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.408305 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69d842fa2f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.993160239 +0000 UTC m=+2.874186369,LastTimestamp:2026-03-20 13:23:07.993160239 +0000 UTC m=+2.874186369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.414723 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69d9673aff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:23:08.012313343 +0000 UTC m=+2.893339483,LastTimestamp:2026-03-20 13:23:08.012313343 +0000 UTC m=+2.893339483,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.421220 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f69db876078 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:08.04797452 +0000 UTC m=+2.929000650,LastTimestamp:2026-03-20 13:23:08.04797452 +0000 UTC m=+2.929000650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.427454 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f69dc3c15d8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:08.059817432 +0000 UTC 
m=+2.940843572,LastTimestamp:2026-03-20 13:23:08.059817432 +0000 UTC m=+2.940843572,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.436010 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a0c46f803 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:08.865837059 +0000 UTC m=+3.746863189,LastTimestamp:2026-03-20 13:23:08.865837059 +0000 UTC m=+3.746863189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.444601 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a1964f299 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container 
etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.085905561 +0000 UTC m=+3.966931731,LastTimestamp:2026-03-20 13:23:09.085905561 +0000 UTC m=+3.966931731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.450934 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a19fa0aba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.095676602 +0000 UTC m=+3.976702762,LastTimestamp:2026-03-20 13:23:09.095676602 +0000 UTC m=+3.976702762,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.457569 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a1a0d7779 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.096949625 +0000 UTC m=+3.977975765,LastTimestamp:2026-03-20 13:23:09.096949625 +0000 UTC m=+3.977975765,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.463898 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a26fcdd89 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.313965449 +0000 UTC m=+4.194991609,LastTimestamp:2026-03-20 13:23:09.313965449 +0000 UTC m=+4.194991609,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.470111 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a27e53d50 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.32919432 +0000 UTC m=+4.210220460,LastTimestamp:2026-03-20 13:23:09.32919432 +0000 UTC m=+4.210220460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.475766 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a27fba228 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.330661928 +0000 UTC m=+4.211688068,LastTimestamp:2026-03-20 13:23:09.330661928 +0000 UTC m=+4.211688068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.482121 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a35b5d22b openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.560967723 +0000 UTC m=+4.441993863,LastTimestamp:2026-03-20 13:23:09.560967723 +0000 UTC m=+4.441993863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.487953 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a36a2f08c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.576507532 +0000 UTC m=+4.457533672,LastTimestamp:2026-03-20 13:23:09.576507532 +0000 UTC m=+4.457533672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.492193 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a36b3fe99 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.577625241 +0000 UTC m=+4.458651381,LastTimestamp:2026-03-20 13:23:09.577625241 +0000 UTC m=+4.458651381,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.495988 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a462f2b2d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.837355821 +0000 UTC m=+4.718381991,LastTimestamp:2026-03-20 13:23:09.837355821 +0000 UTC m=+4.718381991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.504639 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a47010421 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.851108385 +0000 UTC m=+4.732134545,LastTimestamp:2026-03-20 13:23:09.851108385 +0000 UTC m=+4.732134545,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.506492 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a4716fefe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:09.852548862 +0000 UTC m=+4.733575032,LastTimestamp:2026-03-20 13:23:09.852548862 +0000 UTC m=+4.733575032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.510739 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189e8f6a56a9f2d6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:10.113837782 +0000 UTC m=+4.994863942,LastTimestamp:2026-03-20 13:23:10.113837782 +0000 UTC m=+4.994863942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.516558 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f6a57b816b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:10.131541687 +0000 UTC m=+5.012567857,LastTimestamp:2026-03-20 13:23:10.131541687 +0000 UTC m=+5.012567857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.526074 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f69cb420727\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69cb420727 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.774994215 +0000 UTC m=+2.656020345,LastTimestamp:2026-03-20 13:23:19.113638128 +0000 UTC m=+13.994664268,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.532472 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f69d842fa2f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69d842fa2f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.993160239 +0000 UTC m=+2.874186369,LastTimestamp:2026-03-20 13:23:19.32863678 +0000 UTC m=+14.209662920,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 
13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.536344 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f69d9673aff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f69d9673aff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:08.012313343 +0000 UTC m=+2.893339483,LastTimestamp:2026-03-20 13:23:19.336842661 +0000 UTC m=+14.217868811,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.542738 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:23:50 crc kubenswrapper[4856]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f6c870f5a7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) 
Mar 20 13:23:50 crc kubenswrapper[4856]: body: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.515724415 +0000 UTC m=+14.396750585,LastTimestamp:2026-03-20 13:23:19.515724415 +0000 UTC m=+14.396750585,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.549062 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f6c87109653 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.515805267 +0000 UTC m=+14.396831427,LastTimestamp:2026-03-20 13:23:19.515805267 +0000 UTC m=+14.396831427,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.554842 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:23:50 crc 
kubenswrapper[4856]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f6c9db9876f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:23:50 crc kubenswrapper[4856]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:23:50 crc kubenswrapper[4856]: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.895975791 +0000 UTC m=+14.777001951,LastTimestamp:2026-03-20 13:23:19.895975791 +0000 UTC m=+14.777001951,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.558785 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f6c9dba5290 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.896027792 +0000 UTC 
m=+14.777053952,LastTimestamp:2026-03-20 13:23:19.896027792 +0000 UTC m=+14.777053952,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.565188 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f6c9db9876f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:23:50 crc kubenswrapper[4856]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f6c9db9876f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:23:50 crc kubenswrapper[4856]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:23:50 crc kubenswrapper[4856]: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.895975791 +0000 UTC m=+14.777001951,LastTimestamp:2026-03-20 13:23:19.905134477 +0000 UTC m=+14.786160617,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.571479 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f6c9dba5290\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f6c9dba5290 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.896027792 +0000 UTC m=+14.777053952,LastTimestamp:2026-03-20 13:23:19.905196518 +0000 UTC m=+14.786222658,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.577833 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f6c870f5a7f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:23:50 crc kubenswrapper[4856]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f6c870f5a7f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:23:50 crc kubenswrapper[4856]: body: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.515724415 +0000 UTC 
m=+14.396750585,LastTimestamp:2026-03-20 13:23:29.515927603 +0000 UTC m=+24.396953763,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.581827 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f6c87109653\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f6c87109653 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:19.515805267 +0000 UTC m=+14.396831427,LastTimestamp:2026-03-20 13:23:29.515987615 +0000 UTC m=+24.397013785,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.586661 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:23:50 crc kubenswrapper[4856]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f70d6e85cf7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:39886->192.168.126.11:10357: read: connection reset by peer Mar 20 13:23:50 crc kubenswrapper[4856]: body: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:38.035215607 +0000 UTC m=+32.916241807,LastTimestamp:2026-03-20 13:23:38.035215607 +0000 UTC m=+32.916241807,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.592625 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f70d6ea0b0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:39886->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:38.03532571 +0000 UTC m=+32.916351930,LastTimestamp:2026-03-20 13:23:38.03532571 +0000 UTC m=+32.916351930,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.599172 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f70d712864a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:38.037978698 +0000 UTC m=+32.919004818,LastTimestamp:2026-03-20 13:23:38.037978698 +0000 UTC m=+32.919004818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.605851 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f698ed6103e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f698ed6103e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:06.761285694 +0000 UTC m=+1.642311864,LastTimestamp:2026-03-20 13:23:38.055710424 +0000 UTC m=+32.936736594,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.613408 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f69a2ea8fff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69a2ea8fff openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.098173439 +0000 UTC m=+1.979199569,LastTimestamp:2026-03-20 13:23:38.26035381 +0000 UTC m=+33.141379960,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.620606 4856 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f69a44ceef4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f69a44ceef4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:07.121397492 +0000 UTC m=+2.002423612,LastTimestamp:2026-03-20 13:23:38.269307999 +0000 UTC m=+33.150334119,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.628394 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:23:50 crc kubenswrapper[4856]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f73833cb6cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 13:23:50 crc kubenswrapper[4856]: body: Mar 20 13:23:50 crc kubenswrapper[4856]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:49.516359371 +0000 UTC m=+44.397385521,LastTimestamp:2026-03-20 13:23:49.516359371 +0000 UTC 
m=+44.397385521,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:23:50 crc kubenswrapper[4856]: > Mar 20 13:23:50 crc kubenswrapper[4856]: E0320 13:23:50.635563 4856 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f73833d8304 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:23:49.516411652 +0000 UTC m=+44.397437802,LastTimestamp:2026-03-20 13:23:49.516411652 +0000 UTC m=+44.397437802,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:23:50 crc kubenswrapper[4856]: I0320 13:23:50.749757 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:51 crc kubenswrapper[4856]: I0320 13:23:51.751095 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 
20 13:23:52 crc kubenswrapper[4856]: I0320 13:23:52.748987 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:53 crc kubenswrapper[4856]: W0320 13:23:53.066183 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:23:53 crc kubenswrapper[4856]: E0320 13:23:53.066258 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:23:53 crc kubenswrapper[4856]: I0320 13:23:53.749308 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:53 crc kubenswrapper[4856]: W0320 13:23:53.918817 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:23:53 crc kubenswrapper[4856]: E0320 13:23:53.918895 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:23:54 crc kubenswrapper[4856]: I0320 
13:23:54.748678 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:54 crc kubenswrapper[4856]: W0320 13:23:54.831477 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:23:54 crc kubenswrapper[4856]: E0320 13:23:54.831548 4856 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.749750 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.819127 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.820592 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.820629 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.820639 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:55 crc kubenswrapper[4856]: I0320 13:23:55.821243 4856 scope.go:117] "RemoveContainer" 
containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae" Mar 20 13:23:55 crc kubenswrapper[4856]: E0320 13:23:55.821474 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:55 crc kubenswrapper[4856]: E0320 13:23:55.916716 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:23:56 crc kubenswrapper[4856]: E0320 13:23:56.393870 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.520703 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.520978 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.522489 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.522557 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.522575 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:56 crc kubenswrapper[4856]: 
I0320 13:23:56.525686 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.664597 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.666320 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.666374 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.666394 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.666428 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:23:56 crc kubenswrapper[4856]: E0320 13:23:56.675920 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:23:56 crc kubenswrapper[4856]: I0320 13:23:56.748382 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:57 crc kubenswrapper[4856]: W0320 13:23:57.179863 4856 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:23:57 crc kubenswrapper[4856]: E0320 13:23:57.179961 4856 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.261044 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.262443 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.262495 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.262514 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.748906 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.814834 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.815075 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.816813 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.816840 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:57 crc kubenswrapper[4856]: I0320 13:23:57.816848 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:58 crc kubenswrapper[4856]: I0320 13:23:58.746454 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:23:59 crc kubenswrapper[4856]: I0320 13:23:59.749235 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:00 crc kubenswrapper[4856]: I0320 13:24:00.747683 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:01 crc kubenswrapper[4856]: I0320 13:24:01.748594 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:02 crc kubenswrapper[4856]: I0320 13:24:02.749366 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:03 crc kubenswrapper[4856]: E0320 13:24:03.403001 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.676474 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.677801 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.677874 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.677893 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.678911 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:24:03 crc kubenswrapper[4856]: E0320 13:24:03.688030 4856 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 13:24:03 crc kubenswrapper[4856]: I0320 13:24:03.749569 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:04 crc kubenswrapper[4856]: I0320 13:24:04.748669 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:05 crc kubenswrapper[4856]: I0320 13:24:05.749329 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:05 crc kubenswrapper[4856]: E0320 13:24:05.917100 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:24:06 crc kubenswrapper[4856]: I0320 13:24:06.749439 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:07 crc kubenswrapper[4856]: I0320 13:24:07.746205 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:08 crc kubenswrapper[4856]: I0320 13:24:08.753332 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.749703 4856 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.819469 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.821081 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.821122 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.821133 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:09 crc kubenswrapper[4856]: I0320 13:24:09.821705 4856 scope.go:117] "RemoveContainer" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.300185 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.303152 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009"}
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.303335 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.304809 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.304865 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.304884 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.410750 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.457688 4856 csr.go:261] certificate signing request csr-ccrdv is approved, waiting to be issued
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.468121 4856 csr.go:257] certificate signing request csr-ccrdv is issued
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.487055 4856 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.583126 4856 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.688543 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.690036 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.690128 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.690154 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.690402 4856 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.705381 4856 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.705642 4856 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.705660 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.709527 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.709558 4856
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.709568 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.709587 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.709598 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:10Z","lastTransitionTime":"2026-03-20T13:24:10Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?, CSINode is not yet initialized]"}
Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.726606 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c
-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.734922 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.734982 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.735003 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.735031 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.735077 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:10Z","lastTransitionTime":"2026-03-20T13:24:10Z","reason":"KubeletNotReady","message":"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?, CSINode is not yet initialized]"}
Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.750487 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"[container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?, CSINode is not yet initialized]\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/red
hat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"n
ames\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c
-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.757175 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.757210 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.757219 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.757232 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.757240 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:10Z","lastTransitionTime":"2026-03-20T13:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.767245 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.773260 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.773350 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.773374 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.773400 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:10 crc kubenswrapper[4856]: I0320 13:24:10.773426 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:10Z","lastTransitionTime":"2026-03-20T13:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.784763 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.784866 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.784893 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.885522 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:10 crc kubenswrapper[4856]: E0320 13:24:10.986023 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.086647 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.187080 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.288098 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.307196 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.307888 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.309430 4856 generic.go:334] "Generic 
(PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" exitCode=255 Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.309467 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009"} Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.309503 4856 scope.go:117] "RemoveContainer" containerID="7d10430fc320006c550bc18630778c45651d4bbf1d3aa9679c30c0e4fc02b1ae" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.309729 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.311096 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.311169 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.311183 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.312242 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.312605 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:11 crc 
kubenswrapper[4856]: E0320 13:24:11.388717 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.469746 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-13 12:01:26.828944315 +0000 UTC Mar 20 13:24:11 crc kubenswrapper[4856]: I0320 13:24:11.469791 4856 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6430h37m15.359156978s for next certificate rotation Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.488848 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.589367 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.690433 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.790849 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:11 crc kubenswrapper[4856]: E0320 13:24:11.891204 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.066346 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.167462 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.268623 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc 
kubenswrapper[4856]: I0320 13:24:12.315251 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.369680 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.470328 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.571326 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.671791 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.772162 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 13:24:12.853031 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 13:24:12.853323 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 13:24:12.855232 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 13:24:12.855298 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 13:24:12.855313 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:12 crc kubenswrapper[4856]: I0320 
13:24:12.855966 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.856154 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.872371 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:12 crc kubenswrapper[4856]: E0320 13:24:12.972752 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.073713 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.174093 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.274324 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.374800 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.475699 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.576450 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: 
E0320 13:24:13.677379 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.778547 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.879238 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.893499 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.893642 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.894932 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.894998 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.895020 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:13 crc kubenswrapper[4856]: I0320 13:24:13.895906 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:13 crc kubenswrapper[4856]: E0320 13:24:13.896116 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:13 crc 
kubenswrapper[4856]: E0320 13:24:13.979971 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.080462 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.181352 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.282144 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.382343 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.482699 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.583772 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.684356 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.785192 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.885401 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:14 crc kubenswrapper[4856]: E0320 13:24:14.986173 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.086775 4856 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.187694 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.287890 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.388739 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.489763 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.590651 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.690819 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.791913 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.892832 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.918280 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:24:15 crc kubenswrapper[4856]: E0320 13:24:15.993312 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.094338 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.195549 4856 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.296552 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.397444 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.498060 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.599169 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.699650 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.800484 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:16 crc kubenswrapper[4856]: E0320 13:24:16.900936 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.001975 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.103027 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.203930 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.304048 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc 
kubenswrapper[4856]: E0320 13:24:17.405066 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.505949 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.606342 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.707113 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.807642 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:17 crc kubenswrapper[4856]: E0320 13:24:17.907811 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.007908 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.108863 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.209986 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.310574 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.411693 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.512848 4856 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.613034 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.713704 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.814703 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:18 crc kubenswrapper[4856]: E0320 13:24:18.915326 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.015560 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.116312 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.217133 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.317947 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.418634 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.519849 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.620737 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.721242 4856 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.822334 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:19 crc kubenswrapper[4856]: E0320 13:24:19.923417 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.024460 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.125011 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.225939 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.326414 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.426574 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.527079 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.627748 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.728725 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.829827 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 
13:24:20.893783 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.899494 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.899555 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.899570 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.899592 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.899605 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:20Z","lastTransitionTime":"2026-03-20T13:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.913624 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.924300 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.924374 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.924389 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.924488 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.924503 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:20Z","lastTransitionTime":"2026-03-20T13:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.941616 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.951738 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.951789 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.951804 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.951825 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.951840 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:20Z","lastTransitionTime":"2026-03-20T13:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.966308 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.978194 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.978250 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.978296 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.978323 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:20 crc kubenswrapper[4856]: I0320 13:24:20.978340 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:20Z","lastTransitionTime":"2026-03-20T13:24:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.994624 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.994867 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 13:24:20 crc kubenswrapper[4856]: E0320 13:24:20.994923 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.095550 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.196719 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.297051 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.397788 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.498508 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.598660 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.699629 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.800611 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:21 crc kubenswrapper[4856]: E0320 13:24:21.901128 4856 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.001708 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.102331 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.202677 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.303458 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.403832 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.504408 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.604617 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.705408 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.805939 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:22 crc kubenswrapper[4856]: E0320 13:24:22.907079 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.007857 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc 
kubenswrapper[4856]: E0320 13:24:23.108901 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.210021 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.310581 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.410920 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.511378 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.611883 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.712512 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.813676 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:23 crc kubenswrapper[4856]: E0320 13:24:23.914065 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.014891 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.115453 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.216016 4856 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.316133 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.417200 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.518201 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.618625 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.719808 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.820578 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:24 crc kubenswrapper[4856]: E0320 13:24:24.921053 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.022464 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.123706 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.224827 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.324927 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.425378 4856 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.526557 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.627485 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.728244 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: I0320 13:24:25.819118 4856 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:24:25 crc kubenswrapper[4856]: I0320 13:24:25.820697 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:25 crc kubenswrapper[4856]: I0320 13:24:25.820770 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:25 crc kubenswrapper[4856]: I0320 13:24:25.820788 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:25 crc kubenswrapper[4856]: I0320 13:24:25.821799 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.822125 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.828948 4856 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.919447 4856 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:24:25 crc kubenswrapper[4856]: E0320 13:24:25.929027 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.030146 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.130969 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.231178 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.332301 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.433327 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.533573 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.633806 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: I0320 13:24:26.695473 4856 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.734511 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 
13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.835120 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:26 crc kubenswrapper[4856]: E0320 13:24:26.935322 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.036068 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.136907 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.237740 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.338699 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.439711 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.540222 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.640381 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.740735 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.841824 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:27 crc kubenswrapper[4856]: E0320 13:24:27.942587 4856 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.043605 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.144002 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.244965 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.345325 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.445968 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.546409 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.647244 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.747907 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.848055 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:28 crc kubenswrapper[4856]: E0320 13:24:28.948800 4856 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.000110 4856 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.050851 4856 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.050898 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.050916 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.050934 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.050948 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.085094 4856 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.131606 4856 apiserver.go:52] "Watching apiserver"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.135690 4856 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.136347 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.136973 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.137011 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.137080 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.137438 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.137442 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.137620 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.137710 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.138034 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.138093 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.140124 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.142305 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.142805 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.142942 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.143288 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.143425 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.143491 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.143557 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.146363 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.148886 4856 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.153104 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.153234 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.153373 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.153475 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.153629 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.178898 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.179215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.179406 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.179550 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.179960 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.180118 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181262 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181437 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181489 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181538 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181587 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181659 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181704 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181745 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181782 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181823 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181859 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181898 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181940 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181987 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182124 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182174 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182218 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182485 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182732 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182778 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182819 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.182860 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183149 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183254 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183328 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183444 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183487 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") 
pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183529 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183616 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183689 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183873 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183917 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183960 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184002 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184040 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184082 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184150 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184198 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 
20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184238 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184405 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184484 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184528 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184672 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184729 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184775 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184816 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184852 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184893 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184932 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181892 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184971 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181926 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182690 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182723 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185013 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185051 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185093 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185170 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185210 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185247 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185321 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185366 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185399 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185438 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185479 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185520 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185557 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185598 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185667 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185719 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185765 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185807 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185842 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185881 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185925 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185961 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186001 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186043 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186084 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186123 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186169 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186213 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186251 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186316 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186380 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186529 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186576 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186617 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186667 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.186701 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186740 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186781 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186892 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186948 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187690 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187764 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187911 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187954 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188021 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188069 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188106 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.182943 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.181262 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183512 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.183550 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184218 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.184923 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185075 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188761 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188842 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185564 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188906 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185664 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). 
InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185971 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.185990 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186424 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186493 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186431 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186952 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.186975 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187489 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187559 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187637 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187817 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189144 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189171 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189229 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189248 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190486 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188210 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188542 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190559 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188739 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188729 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.188836 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190608 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190659 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.187859 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189551 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189704 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189811 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190133 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190147 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190706 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.190861 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.191718 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.191790 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192678 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192720 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192744 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192743 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192769 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.189434 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192869 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.192993 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193039 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193077 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193086 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193112 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193148 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193183 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193224 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193233 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193263 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193333 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193317 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193369 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193411 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193446 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193480 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193512 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193544 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193576 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193610 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193681 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193715 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193770 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193805 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193840 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193872 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193907 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193941 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193974 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194009 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194042 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194076 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194111 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194147 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 
13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194182 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194219 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194254 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194319 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194352 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194385 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194419 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194458 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194491 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194525 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194575 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194615 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194649 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194683 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194719 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194824 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194858 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194895 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194932 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194972 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195017 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195057 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195091 
4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195128 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195161 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195196 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195230 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195312 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195356 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195390 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195432 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195466 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195513 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195548 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195580 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195618 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195672 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195725 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195769 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195806 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195842 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195879 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195919 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193367 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193321 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193780 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193807 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.193518 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194611 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194571 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194715 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203191 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194844 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194928 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.194957 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195233 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195441 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195491 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195673 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195781 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.195886 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.195967 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:29.695940505 +0000 UTC m=+84.576966645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196057 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196138 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196218 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196306 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196781 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196864 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.196876 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198109 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198247 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198370 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198535 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198724 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198852 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.198950 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.199186 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.199288 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.199354 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.199538 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.199954 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200037 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200076 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200125 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200174 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200202 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200221 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200439 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200656 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200688 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200748 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200848 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200818 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200880 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.201022 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.201046 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.200545 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.201800 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.201821 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.202035 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.202748 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.202748 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.202774 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203010 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203038 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203261 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203816 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.204362 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.204711 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.205104 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.205177 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.205209 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.205283 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206145 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206210 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206378 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206436 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206520 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206564 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206610 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206628 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.203254 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206647 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206950 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206647 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207002 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206645 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.206631 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207048 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.207125 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207175 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207173 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.207210 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:29.707184944 +0000 UTC m=+84.588211104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207244 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207313 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.207344 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.207412 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:29.707392519 +0000 UTC m=+84.588418779 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207353 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207778 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207810 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.207860 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc 
kubenswrapper[4856]: I0320 13:24:29.208505 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.209466 4856 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.210013 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211699 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211769 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211916 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211928 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211944 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211962 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.211984 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212004 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.212021 4856 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212038 4856 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212057 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212077 4856 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212040 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212095 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212209 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212265 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212318 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212379 4856 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212401 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212421 4856 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.212479 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212499 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212558 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212580 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212600 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212656 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212676 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212694 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212751 4856 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212770 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212790 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212847 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212866 4856 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212884 4856 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212942 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: 
I0320 13:24:29.212964 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212982 4856 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213040 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213059 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213113 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213137 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213157 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213217 
4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213240 4856 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213259 4856 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213324 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213342 4856 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213402 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213426 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213446 4856 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213503 4856 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213521 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213540 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213601 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213621 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213699 4856 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213730 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213785 4856 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213807 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213826 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213882 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213902 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213920 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213979 4856 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 
13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212124 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.212369 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.213999 4856 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214264 4856 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214315 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214339 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") 
on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214360 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214379 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214399 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214420 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214439 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214458 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214477 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214497 4856 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214517 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214537 4856 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214556 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214576 4856 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214598 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214616 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214634 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214652 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214671 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214691 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214709 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214726 4856 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214744 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214762 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") 
on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214780 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214800 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214819 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214836 4856 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214854 4856 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214872 4856 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214891 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 
crc kubenswrapper[4856]: I0320 13:24:29.214908 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214926 4856 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214957 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.214998 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215026 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215053 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215079 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215102 4856 reconciler_common.go:293] 
"Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215125 4856 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215149 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215177 4856 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215205 4856 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215229 4856 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.215249 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216102 4856 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on 
node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216130 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216149 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216170 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216190 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216264 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216343 4856 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216369 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216396 4856 
reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216416 4856 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216436 4856 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216455 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216473 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216491 4856 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216508 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.216909 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.218717 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.220629 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.220876 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.224583 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.229405 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.229573 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.229914 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.229933 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.229948 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.230011 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:29.72999396 +0000 UTC m=+84.611020110 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.230148 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.230873 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.230899 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.230910 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.230958 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:29.730945125 +0000 UTC m=+84.611971265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.230995 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.232252 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.233780 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.234350 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.234696 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.237650 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.239798 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.239968 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.240127 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.240260 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.240764 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.240848 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241130 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241276 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241320 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241413 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241413 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241531 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241926 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241597 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241601 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241643 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241860 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.241940 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242084 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242011 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242391 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242746 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242848 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.242944 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.245872 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.246923 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247046 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247106 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247242 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247300 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247636 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247749 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.247697 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.248036 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.248492 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.248743 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.248745 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.249015 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.249141 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.249448 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.249461 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.249752 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250144 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250196 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250336 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250598 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250725 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.250909 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.251355 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.251415 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.251558 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.252320 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.254428 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.254443 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.254527 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.254849 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.256772 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257224 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257852 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257888 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257899 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257930 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.257943 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.267065 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.273776 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.276751 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.283064 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.286922 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.290326 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317251 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317366 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317370 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317633 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317658 4856 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317677 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317417 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317692 4856 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317910 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317934 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317948 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317962 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.317974 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: 
\"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318017 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318037 4856 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318055 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318072 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318091 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318108 4856 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318125 4856 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318142 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318159 4856 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318176 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318198 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318217 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318234 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318252 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318305 4856 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318323 4856 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318340 4856 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318359 4856 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318376 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318394 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318411 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318427 4856 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318444 4856 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318461 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318479 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318495 4856 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318513 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318529 4856 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318545 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318562 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318578 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318596 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318615 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318631 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318647 4856 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318664 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318681 4856 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318697 4856 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318713 4856 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318730 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318748 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318765 4856 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318783 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318799 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" 
(UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318817 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318833 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318850 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318867 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318884 4856 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318900 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318916 4856 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318932 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318949 4856 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318969 4856 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.318985 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319002 4856 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319019 4856 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319037 4856 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319054 
4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319071 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319088 4856 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319105 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.319122 4856 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.361059 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.361137 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.361157 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.361191 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 
crc kubenswrapper[4856]: I0320 13:24:29.361215 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.460465 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.464836 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.464934 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.464993 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.465020 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.465074 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.468038 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.474687 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:24:29 crc kubenswrapper[4856]: W0320 13:24:29.501761 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-8753695336ac077477bb74d4fde8f52f0f1d228327cec1547ddaffb9a33eb216 WatchSource:0}: Error finding container 8753695336ac077477bb74d4fde8f52f0f1d228327cec1547ddaffb9a33eb216: Status 404 returned error can't find the container with id 8753695336ac077477bb74d4fde8f52f0f1d228327cec1547ddaffb9a33eb216 Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.567774 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.567834 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.567853 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.567884 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.567902 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.674227 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.674326 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.674347 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.674383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.674403 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.723310 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.723574 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.723678 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:30.72361326 +0000 UTC m=+85.604639430 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.723735 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.723782 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.723882 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:30.723845586 +0000 UTC m=+85.604871876 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.723962 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.724072 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:30.724055012 +0000 UTC m=+85.605081182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.777147 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.777194 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.777206 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.777225 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc 
kubenswrapper[4856]: I0320 13:24:29.777238 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.824323 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.824389 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824510 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824515 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824530 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824547 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824550 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824567 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824626 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:30.824605772 +0000 UTC m=+85.705631912 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: E0320 13:24:29.824643 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:30.824635673 +0000 UTC m=+85.705661813 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.827126 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.828896 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.831889 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 
13:24:29.833912 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.836342 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.837326 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.838142 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.839588 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.840431 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.841775 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.842925 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 
13:24:29.844805 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.845603 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.846321 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.847536 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.848194 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.849582 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.850145 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.850898 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 
13:24:29.852199 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.852886 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.854158 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.854730 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.856366 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.857100 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.858132 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.859915 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 
13:24:29.860621 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.862182 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.862867 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.864130 4856 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.864263 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.866603 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.868059 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.868744 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.870855 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.872027 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.873327 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.874369 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.875796 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.876347 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.877374 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.878043 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.879063 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.879533 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.880419 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.880944 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.882119 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.882616 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.883512 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.883962 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.885034 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.885589 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.886047 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.888565 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.888595 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.888605 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.888617 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.888625 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.991840 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.991918 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.991941 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.991974 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:29 crc kubenswrapper[4856]: I0320 13:24:29.991995 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:29Z","lastTransitionTime":"2026-03-20T13:24:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.096360 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.096416 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.096439 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.096472 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.096495 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.199583 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.199628 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.199639 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.199655 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.199665 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.302039 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.302093 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.302109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.302132 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.302151 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.372300 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.372391 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5a84875575f18b9a0971a0b7a86250c6c0ba0e8d0304fc2815e3fa6dbe22aa99"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.373419 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8753695336ac077477bb74d4fde8f52f0f1d228327cec1547ddaffb9a33eb216"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.376690 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.376714 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.376724 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5dd45aad267deec8be145ae5bbfd5fde2904c73c792b977965590f73140b1c2b"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.388451 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.399598 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.404361 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.404392 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.404401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.404417 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.404426 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.416798 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.430381 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.440830 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.451712 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.465597 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.479791 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.498133 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.507439 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.507488 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.507500 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.507519 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.507530 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.511389 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.521591 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.533367 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.609799 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.609867 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.609885 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.609906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.609924 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.712715 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.712792 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.712816 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.712845 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.712867 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.733214 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.733340 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.733458 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.733594 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.733623 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:32.733593144 +0000 UTC m=+87.614619284 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.733646 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.733672 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:32.733648485 +0000 UTC m=+87.614674645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.733733 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:32.733712507 +0000 UTC m=+87.614738677 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.816219 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.816303 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.816322 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.816362 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.816380 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.819359 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.819428 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.819369 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.819548 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.819645 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.819734 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.835606 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.835746 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.835942 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.835990 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836004 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836030 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836034 4856 
projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836052 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836135 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:32.836106426 +0000 UTC m=+87.717132606 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:30 crc kubenswrapper[4856]: E0320 13:24:30.836168 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:32.836155868 +0000 UTC m=+87.717182038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.919373 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.919421 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.919434 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.919451 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:30 crc kubenswrapper[4856]: I0320 13:24:30.919463 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:30Z","lastTransitionTime":"2026-03-20T13:24:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.021778 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.021820 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.021830 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.021847 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.021858 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.124060 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.124101 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.124117 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.124137 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.124155 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.166262 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.166354 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.166376 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.166403 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.166429 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.179940 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.185574 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.185652 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.185673 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.185701 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.185721 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.206811 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.211925 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.211979 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.211996 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.212018 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.212035 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.229000 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.233769 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.233807 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.233823 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.233842 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.233854 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.250054 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.255729 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.255777 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.255789 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.255808 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.255823 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.282425 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:31 crc kubenswrapper[4856]: E0320 13:24:31.282608 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.285081 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.285115 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.285126 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.285148 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.285160 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.387198 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.387242 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.387254 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.387292 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.387312 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.489601 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.489658 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.489676 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.489696 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.489713 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.593401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.593455 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.593472 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.593503 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.593520 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.695734 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.695801 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.695820 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.695845 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.695863 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.798020 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.798361 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.798396 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.798422 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.798454 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.837853 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.901112 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.901175 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.901198 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.901226 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:31 crc kubenswrapper[4856]: I0320 13:24:31.901249 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:31Z","lastTransitionTime":"2026-03-20T13:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.003876 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.003920 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.003935 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.003957 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.003970 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.106652 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.106711 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.106729 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.106753 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.106775 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.209098 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.209775 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.209894 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.210008 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.210096 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.313646 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.313702 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.313719 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.313741 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.313757 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.416137 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.416562 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.416725 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.416907 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.417062 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.520784 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.521322 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.521491 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.521670 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.521824 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.624908 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.625240 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.625439 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.625583 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.625705 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.729024 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.729071 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.729086 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.729104 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.729115 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.755185 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.755426 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:36.755376281 +0000 UTC m=+91.636402451 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.755555 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.755655 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.755796 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.755803 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.755895 4856 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:36.755870153 +0000 UTC m=+91.636896403 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.755930 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:36.755913276 +0000 UTC m=+91.636939546 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.818802 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.818887 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.818799 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.819017 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.819235 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.819386 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.831559 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.831611 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.831623 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.831641 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.831653 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.856876 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.856920 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857048 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857067 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857082 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857147 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:36.857131453 +0000 UTC m=+91.738157593 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857220 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857331 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857361 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:32 crc kubenswrapper[4856]: E0320 13:24:32.857457 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:36.857422531 +0000 UTC m=+91.738448711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.934710 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.934788 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.934829 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.934860 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:32 crc kubenswrapper[4856]: I0320 13:24:32.934884 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:32Z","lastTransitionTime":"2026-03-20T13:24:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.037764 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.037810 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.037826 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.037845 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.037859 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.140707 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.140773 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.140797 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.140826 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.140847 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.244892 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.244956 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.244975 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.245000 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.245020 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.348597 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.348676 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.348699 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.348729 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.348749 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.386656 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.405176 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.423670 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.445459 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.452000 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.452069 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.452090 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.452119 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.452175 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.480138 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.496207 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.516761 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.533570 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.554374 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.554433 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.554452 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.554475 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.554492 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.658025 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.658128 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.658147 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.658173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.658191 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.760294 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.760364 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.760387 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.760417 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.760439 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.862658 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.862728 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.862748 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.862772 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.862789 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.965609 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.965668 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.965685 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.965723 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:33 crc kubenswrapper[4856]: I0320 13:24:33.965742 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:33Z","lastTransitionTime":"2026-03-20T13:24:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.067830 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.067883 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.067895 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.067911 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.067922 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.170588 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.170652 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.170669 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.170694 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.170711 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.274988 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.275062 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.275080 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.275114 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.275132 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.377673 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.377721 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.377733 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.377749 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.377760 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.480616 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.480660 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.480672 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.480689 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.480701 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.583412 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.583455 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.583466 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.583483 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.583493 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.686095 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.686170 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.686189 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.686212 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.686235 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.797082 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.797203 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.797222 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.797247 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.797264 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.819115 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.819174 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:34 crc kubenswrapper[4856]: E0320 13:24:34.819259 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.819193 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:34 crc kubenswrapper[4856]: E0320 13:24:34.819433 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:34 crc kubenswrapper[4856]: E0320 13:24:34.819654 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.899824 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.899903 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.899937 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.899965 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:34 crc kubenswrapper[4856]: I0320 13:24:34.899986 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:34Z","lastTransitionTime":"2026-03-20T13:24:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.002990 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.003039 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.003056 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.003079 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.003096 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.105851 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.105898 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.105912 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.105928 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.105938 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.208288 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.208334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.208346 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.208363 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.208374 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.311095 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.311136 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.311147 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.311162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.311173 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.413651 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.413713 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.413735 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.413759 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.413778 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.520534 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.520579 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.520590 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.520605 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.520619 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.624622 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.624670 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.624681 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.624699 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.624727 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.726810 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.726861 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.726878 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.726901 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.726919 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.829731 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.829791 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.829813 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.829839 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.829862 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.834109 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.836961 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.851558 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.866046 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.882716 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.902839 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.920760 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.955219 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.955326 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.955352 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.955383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.955601 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:35Z","lastTransitionTime":"2026-03-20T13:24:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:35 crc kubenswrapper[4856]: I0320 13:24:35.970631 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.057193 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.057229 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.057241 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.057257 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.057291 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.159612 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.159670 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.159686 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.159703 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.159715 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.262579 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.262627 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.262644 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.262671 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.262687 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.365615 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.365682 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.365698 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.365722 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.365746 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.475094 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.475129 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.475138 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.475151 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.475160 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.578248 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.578311 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.578327 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.578349 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.578368 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.681497 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.681897 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.682091 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.682266 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.682513 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.697547 4856 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.785241 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.785302 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.785316 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.785332 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.785343 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.793696 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.793989 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.794052 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:44.793977103 +0000 UTC m=+99.675003243 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.794104 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.794448 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:44.794421374 +0000 UTC m=+99.675447534 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.794549 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.794611 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:44.794602429 +0000 UTC m=+99.675628559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.794762 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.819202 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.819300 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.819378 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.819390 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.819507 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.819568 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.888066 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.888102 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.888110 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.888123 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.888135 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.895600 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.895694 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895814 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895833 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895845 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895894 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:44.895879498 +0000 UTC m=+99.776905638 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895911 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895966 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.895987 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:36 crc kubenswrapper[4856]: E0320 13:24:36.896058 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 
nodeName:}" failed. No retries permitted until 2026-03-20 13:24:44.896034842 +0000 UTC m=+99.777061012 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.998323 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.998366 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.998383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.998404 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:36 crc kubenswrapper[4856]: I0320 13:24:36.998421 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:36Z","lastTransitionTime":"2026-03-20T13:24:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.101598 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.101645 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.101656 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.101673 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.101685 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.204168 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.204218 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.204228 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.204246 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.204257 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.306989 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.307054 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.307078 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.307107 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.307130 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.409547 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.409608 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.409624 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.409643 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.409657 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.511865 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.511918 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.511934 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.511957 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.511974 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.614816 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.614867 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.614886 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.614908 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.614926 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.717145 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.717191 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.717202 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.717218 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.717233 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.819979 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.820023 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.820031 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.820046 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.820056 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.922772 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.922834 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.922850 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.922873 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:37 crc kubenswrapper[4856]: I0320 13:24:37.922890 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:37Z","lastTransitionTime":"2026-03-20T13:24:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.025982 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.026032 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.026044 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.026066 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.026078 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.129249 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.129324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.129338 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.129356 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.129370 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.232186 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.232246 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.232262 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.232315 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.232333 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.335355 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.335430 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.335449 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.335471 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.335488 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.438112 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.438181 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.438200 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.438224 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.438244 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.541671 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.541764 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.541792 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.541826 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.541851 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.644024 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.644079 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.644090 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.644109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.644122 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.746383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.746421 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.746431 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.746446 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.746458 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.819754 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.819759 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:38 crc kubenswrapper[4856]: E0320 13:24:38.819988 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.819759 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:38 crc kubenswrapper[4856]: E0320 13:24:38.820042 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:38 crc kubenswrapper[4856]: E0320 13:24:38.820164 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.849249 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.849300 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.849309 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.849324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.849333 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.952386 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.952442 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.952453 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.952470 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:38 crc kubenswrapper[4856]: I0320 13:24:38.952480 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:38Z","lastTransitionTime":"2026-03-20T13:24:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.054832 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.054877 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.054889 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.054906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.054919 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.165122 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.165159 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.165168 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.165182 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.165193 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.242072 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cb9fx"] Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.242608 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-bvt29"] Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.242754 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.243602 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-chwcj"] Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.243830 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.243937 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.244745 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.247603 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.248516 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.248865 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.250370 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.250802 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.250934 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.251261 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.251700 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.252326 4856 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.266910 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.269231 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.269334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.269359 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.269391 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.269414 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.293744 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6731473
1ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"ce
rt-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.312681 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.333931 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.354589 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.373237 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.373318 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.373335 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.373358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.373375 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.374241 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.393825 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.410135 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.417759 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-cnibin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.417862 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-etc-kubernetes\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.417901 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-daemon-config\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.417968 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-os-release\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418080 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-conf-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418165 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-k8s-cni-cncf-io\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418201 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418322 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418409 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418521 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-cni-binary-copy\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418680 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-multus-certs\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418744 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-kubelet\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418814 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-system-cni-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418851 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-multus\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418881 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-hostroot\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418911 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-cnibin\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418940 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418968 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-system-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.418996 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-bin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419024 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjtnh\" (UniqueName: \"kubernetes.io/projected/209c28b3-bf65-4d96-b67d-531e7463f2d5-kube-api-access-gjtnh\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419065 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlb6h\" (UniqueName: \"kubernetes.io/projected/24c895f6-954c-4211-8957-0d888d862cfa-kube-api-access-mlb6h\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419094 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-netns\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419123 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-os-release\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419152 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svjh5\" (UniqueName: \"kubernetes.io/projected/da4c21dd-2600-4141-bf05-7c18c1932a33-kube-api-access-svjh5\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419181 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/209c28b3-bf65-4d96-b67d-531e7463f2d5-hosts-file\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.419212 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-socket-dir-parent\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.424376 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.437061 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.450790 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.467662 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.476070 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.476117 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.476133 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.476155 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.476172 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.485844 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.498911 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520048 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-conf-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520112 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-os-release\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520160 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520190 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-k8s-cni-cncf-io\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520223 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: 
\"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520252 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520341 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-cni-binary-copy\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520374 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-multus-certs\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520419 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-kubelet\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520463 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-system-cni-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " 
pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520492 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-multus\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520521 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-hostroot\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520549 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-cnibin\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520587 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520626 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-system-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc 
kubenswrapper[4856]: I0320 13:24:39.520665 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-bin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520706 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjtnh\" (UniqueName: \"kubernetes.io/projected/209c28b3-bf65-4d96-b67d-531e7463f2d5-kube-api-access-gjtnh\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520749 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-netns\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520780 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlb6h\" (UniqueName: \"kubernetes.io/projected/24c895f6-954c-4211-8957-0d888d862cfa-kube-api-access-mlb6h\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520815 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-os-release\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520843 
4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svjh5\" (UniqueName: \"kubernetes.io/projected/da4c21dd-2600-4141-bf05-7c18c1932a33-kube-api-access-svjh5\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520871 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/209c28b3-bf65-4d96-b67d-531e7463f2d5-hosts-file\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520900 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-socket-dir-parent\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520967 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-cnibin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.520996 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-etc-kubernetes\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.521027 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-daemon-config\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522346 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-cnibin\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522417 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-kubelet\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522430 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-hostroot\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522493 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-system-cni-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522496 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-multus\") pod \"multus-chwcj\" (UID: 
\"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522537 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-multus-certs\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522667 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/209c28b3-bf65-4d96-b67d-531e7463f2d5-hosts-file\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.522752 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-os-release\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523041 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-cnibin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523101 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-daemon-config\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523147 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-socket-dir-parent\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523190 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-k8s-cni-cncf-io\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523194 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-etc-kubernetes\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523231 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-conf-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523320 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-var-lib-cni-bin\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523323 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-os-release\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523387 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-host-run-netns\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523531 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-system-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523792 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/24c895f6-954c-4211-8957-0d888d862cfa-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.523946 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da4c21dd-2600-4141-bf05-7c18c1932a33-multus-cni-dir\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.524070 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bvt29\" (UID: 
\"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.526762 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/24c895f6-954c-4211-8957-0d888d862cfa-cni-binary-copy\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.528714 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da4c21dd-2600-4141-bf05-7c18c1932a33-cni-binary-copy\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.530235 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.550080 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.553387 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svjh5\" (UniqueName: \"kubernetes.io/projected/da4c21dd-2600-4141-bf05-7c18c1932a33-kube-api-access-svjh5\") pod \"multus-chwcj\" (UID: \"da4c21dd-2600-4141-bf05-7c18c1932a33\") " pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.554868 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjtnh\" (UniqueName: 
\"kubernetes.io/projected/209c28b3-bf65-4d96-b67d-531e7463f2d5-kube-api-access-gjtnh\") pod \"node-resolver-cb9fx\" (UID: \"209c28b3-bf65-4d96-b67d-531e7463f2d5\") " pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.556951 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlb6h\" (UniqueName: \"kubernetes.io/projected/24c895f6-954c-4211-8957-0d888d862cfa-kube-api-access-mlb6h\") pod \"multus-additional-cni-plugins-bvt29\" (UID: \"24c895f6-954c-4211-8957-0d888d862cfa\") " pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.562703 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cb9fx" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.567802 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.578135 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-chwcj" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.581504 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.581544 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.581559 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.581580 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.581595 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.586835 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bvt29" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.593064 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: W0320 13:24:39.598697 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda4c21dd_2600_4141_bf05_7c18c1932a33.slice/crio-ceeacdc29113e758e434b1e850de5870eaf13a43d528bd93d7d633d6c89993c8 WatchSource:0}: Error finding container ceeacdc29113e758e434b1e850de5870eaf13a43d528bd93d7d633d6c89993c8: Status 404 returned error can't find the container with id ceeacdc29113e758e434b1e850de5870eaf13a43d528bd93d7d633d6c89993c8 Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.613685 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.614367 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dhzh4"] Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.615088 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9njpz"] Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.616915 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.616974 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.621252 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.621751 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.621831 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.621933 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.622482 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.622640 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:24:39 crc 
kubenswrapper[4856]: I0320 13:24:39.622771 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.622892 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.622917 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.623091 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.623222 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.626233 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.627606 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.659363 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.675370 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.684171 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.684220 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.684290 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 
13:24:39.684317 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.684337 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.692259 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.706715 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.720421 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723049 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: 
\"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723083 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723101 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723120 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e51a8789-c529-4a2c-b8f1-dc31a3c06403-proxy-tls\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723139 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723165 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51a8789-c529-4a2c-b8f1-dc31a3c06403-rootfs\") pod 
\"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723185 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723222 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723245 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723263 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723312 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723328 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723346 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723375 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723406 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723424 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723440 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723455 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723472 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzmgq\" (UniqueName: \"kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723498 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqfcl\" (UniqueName: \"kubernetes.io/projected/e51a8789-c529-4a2c-b8f1-dc31a3c06403-kube-api-access-zqfcl\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc 
kubenswrapper[4856]: I0320 13:24:39.723518 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723540 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51a8789-c529-4a2c-b8f1-dc31a3c06403-mcd-auth-proxy-config\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723556 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.723573 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.740311 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.751479 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.772914 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.786368 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.788338 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.788386 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.788399 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.788416 4856 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.788428 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.798779 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.821134 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-p
od-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7d
ff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823884 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823913 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823931 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823949 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823973 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.823993 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824009 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824021 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824037 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzmgq\" (UniqueName: \"kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824051 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqfcl\" (UniqueName: \"kubernetes.io/projected/e51a8789-c529-4a2c-b8f1-dc31a3c06403-kube-api-access-zqfcl\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824065 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824082 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824096 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824109 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824125 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51a8789-c529-4a2c-b8f1-dc31a3c06403-mcd-auth-proxy-config\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824140 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824153 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824168 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/e51a8789-c529-4a2c-b8f1-dc31a3c06403-proxy-tls\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824182 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824196 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824210 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824224 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51a8789-c529-4a2c-b8f1-dc31a3c06403-rootfs\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824249 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824294 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824372 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824416 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824450 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824497 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd\") pod \"ovnkube-node-9njpz\" (UID: 
\"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824521 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824561 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824582 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.824999 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825033 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc 
kubenswrapper[4856]: I0320 13:24:39.825050 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e51a8789-c529-4a2c-b8f1-dc31a3c06403-mcd-auth-proxy-config\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825057 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825086 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825415 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825530 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 
13:24:39.825633 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825670 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825695 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825706 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/e51a8789-c529-4a2c-b8f1-dc31a3c06403-rootfs\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825718 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.825717 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.831204 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e51a8789-c529-4a2c-b8f1-dc31a3c06403-proxy-tls\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.837660 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.838247 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.841609 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzmgq\" (UniqueName: \"kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq\") pod \"ovnkube-node-9njpz\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") " pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.842076 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqfcl\" (UniqueName: \"kubernetes.io/projected/e51a8789-c529-4a2c-b8f1-dc31a3c06403-kube-api-access-zqfcl\") pod \"machine-config-daemon-dhzh4\" (UID: \"e51a8789-c529-4a2c-b8f1-dc31a3c06403\") " pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.849150 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.890533 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.890587 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.890597 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.890610 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.890619 4856 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.970470 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:39 crc kubenswrapper[4856]: W0320 13:24:39.985461 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a5ae28_8378_4545_af2d_cf1eb86364a2.slice/crio-eca97356b8fc80daef2b879effb2722a2305fc17f0b77afa02f334ce7f97f82d WatchSource:0}: Error finding container eca97356b8fc80daef2b879effb2722a2305fc17f0b77afa02f334ce7f97f82d: Status 404 returned error can't find the container with id eca97356b8fc80daef2b879effb2722a2305fc17f0b77afa02f334ce7f97f82d Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.986575 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.992127 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.992162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.992173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.992188 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:39 crc kubenswrapper[4856]: I0320 13:24:39.992196 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:39Z","lastTransitionTime":"2026-03-20T13:24:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: W0320 13:24:40.006579 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51a8789_c529_4a2c_b8f1_dc31a3c06403.slice/crio-10875e2873831af63f68995f275bdad60cb25a5f8e8c5ec2e8fc8808457e26fc WatchSource:0}: Error finding container 10875e2873831af63f68995f275bdad60cb25a5f8e8c5ec2e8fc8808457e26fc: Status 404 returned error can't find the container with id 10875e2873831af63f68995f275bdad60cb25a5f8e8c5ec2e8fc8808457e26fc Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.094363 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.094406 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.094417 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.094432 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.094442 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.197636 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.197712 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.197735 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.197764 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.197828 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.300066 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.300097 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.300105 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.300118 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.300126 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.402788 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.402854 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.402874 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.402901 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.402919 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.411293 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cb9fx" event={"ID":"209c28b3-bf65-4d96-b67d-531e7463f2d5","Type":"ContainerStarted","Data":"dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.411374 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cb9fx" event={"ID":"209c28b3-bf65-4d96-b67d-531e7463f2d5","Type":"ContainerStarted","Data":"8a8bfe030afd26619fd6fb486769e6ac267a23f1316b6c4ce2602fcd5f36eed1"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.413359 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.413405 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.413429 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"10875e2873831af63f68995f275bdad60cb25a5f8e8c5ec2e8fc8808457e26fc"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.419989 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" exitCode=0 Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.420162 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.420338 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"eca97356b8fc80daef2b879effb2722a2305fc17f0b77afa02f334ce7f97f82d"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.423050 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerStarted","Data":"de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.423088 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerStarted","Data":"ceeacdc29113e758e434b1e850de5870eaf13a43d528bd93d7d633d6c89993c8"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.428469 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b" exitCode=0 Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.428538 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.428577 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" 
event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerStarted","Data":"7af7eb964700e7a7b7401d00ac561633780a71c6534032dfb4baf7bdf7c4c0fd"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.428899 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.444146 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.456200 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.471121 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.481764 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPa
th\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.506846 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.506881 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 
crc kubenswrapper[4856]: I0320 13:24:40.506891 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.506906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.506916 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.507582 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.521859 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.537145 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.561948 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.577129 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.592628 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.608466 4856 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.609862 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.609890 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.609921 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.609933 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.609942 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.623921 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.643699 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.659654 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.672089 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.685789 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.699182 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.712886 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.713038 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.713077 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc 
kubenswrapper[4856]: I0320 13:24:40.713092 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.713113 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.713128 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.724580 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.739562 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.750730 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.762570 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.786914 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mo
untPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26
702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"f
inishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.803102 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.814010 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.815352 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.815377 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.815388 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.815403 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.815418 4856 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.819425 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.819480 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.819487 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:40 crc kubenswrapper[4856]: E0320 13:24:40.819555 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:40 crc kubenswrapper[4856]: E0320 13:24:40.819641 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:40 crc kubenswrapper[4856]: E0320 13:24:40.819989 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.829639 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.830763 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:40 crc kubenswrapper[4856]: E0320 13:24:40.831019 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.918334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.918367 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.918378 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.918401 4856 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:40 crc kubenswrapper[4856]: I0320 13:24:40.918413 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:40Z","lastTransitionTime":"2026-03-20T13:24:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.020771 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.020809 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.020821 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.020838 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.020850 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.123257 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.123545 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.123554 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.123566 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.123574 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.225937 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.225971 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.225981 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.225997 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.226008 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.328865 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.328899 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.328911 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.328926 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.328938 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.432135 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.432189 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.432206 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.432229 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.432247 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.435978 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348" exitCode=0 Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.436036 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.442834 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.443103 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.443383 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.443428 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.443451 4856 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.443469 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.443486 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.453832 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.469791 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.502793 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.504463 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.504558 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.504577 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.504602 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.504619 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.529119 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.539350 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.543721 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.549341 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.549371 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.549379 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 
13:24:41.549391 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.549400 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.562765 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.570129 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.570173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.570184 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.570202 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.570216 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.575390 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.585608 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.589471 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.590324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.590352 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.590363 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc 
kubenswrapper[4856]: I0320 13:24:41.590379 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.590391 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.608239 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.610064 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.613320 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.613358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.613367 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.613383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.613392 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.623630 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.626582 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: E0320 13:24:41.626740 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.628305 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.628335 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.628344 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.628357 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.628366 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.638631 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.649026 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.663763 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.677015 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.688764 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.730805 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.730859 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.730870 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.730887 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.730899 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.833289 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.833334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.833347 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.833364 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.833377 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.935733 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.935844 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.935870 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.935903 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:41 crc kubenswrapper[4856]: I0320 13:24:41.935929 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:41Z","lastTransitionTime":"2026-03-20T13:24:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.038083 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.038128 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.038143 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.038162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.038177 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.141040 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.141091 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.141103 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.141118 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.141127 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.233163 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-f9hch"] Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.233491 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.235630 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.236672 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.236739 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.236932 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.245412 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.245461 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.245472 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.245489 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.245504 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.254763 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.274197 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.294604 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.311124 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.326189 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.338391 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.348399 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.348450 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.348466 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.348487 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.348503 4856 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.356055 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j6n9\" (UniqueName: \"kubernetes.io/projected/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-kube-api-access-8j6n9\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.356102 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-serviceca\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.356120 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-host\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.372239 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.402708 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.423193 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.439688 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.447008 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.448723 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6" exitCode=0 Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.448773 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.450851 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.450882 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.450895 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.450909 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.450921 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.456661 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j6n9\" (UniqueName: \"kubernetes.io/projected/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-kube-api-access-8j6n9\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.456729 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-serviceca\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.456764 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-host\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.456859 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-host\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.457729 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-serviceca\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.480375 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8j6n9\" (UniqueName: \"kubernetes.io/projected/0cfc66d0-d1a1-43e7-809d-a36f9bb1750c-kube-api-access-8j6n9\") pod \"node-ca-f9hch\" (UID: \"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\") " pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.480356 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.501989 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.517260 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.531409 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containe
rID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.545221 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.552574 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-f9hch" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.553755 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.553797 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.553809 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.553825 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.553837 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.556352 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.568095 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: W0320 13:24:42.575439 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cfc66d0_d1a1_43e7_809d_a36f9bb1750c.slice/crio-69933c57f1983337974d6209d32a893b7beaf0392ca416c9acd1a8a820304083 WatchSource:0}: Error finding container 69933c57f1983337974d6209d32a893b7beaf0392ca416c9acd1a8a820304083: Status 404 returned error can't find the container with id 69933c57f1983337974d6209d32a893b7beaf0392ca416c9acd1a8a820304083 Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.579422 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.591843 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.602621 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.613484 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.623605 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.643365 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658555 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658600 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658615 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658633 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658646 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.658843 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.672950 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.691954 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.705357 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.720742 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 
13:24:42.732327 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.743726 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:42Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.760194 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.760228 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.760236 4856 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.760247 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.760256 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.819699 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.819700 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:42 crc kubenswrapper[4856]: E0320 13:24:42.819822 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.819720 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:42 crc kubenswrapper[4856]: E0320 13:24:42.819924 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:42 crc kubenswrapper[4856]: E0320 13:24:42.819988 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.862231 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.862280 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.862292 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.862313 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.862324 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.971490 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.971559 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.971576 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.971603 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:42 crc kubenswrapper[4856]: I0320 13:24:42.971622 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:42Z","lastTransitionTime":"2026-03-20T13:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.074073 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.074166 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.074184 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.074208 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.074225 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.177032 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.177097 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.177117 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.177147 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.177169 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.280021 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.280076 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.280090 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.280111 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.280127 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.382448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.382502 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.382526 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.382555 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.382574 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.454368 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f9hch" event={"ID":"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c","Type":"ContainerStarted","Data":"48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.454441 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-f9hch" event={"ID":"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c","Type":"ContainerStarted","Data":"69933c57f1983337974d6209d32a893b7beaf0392ca416c9acd1a8a820304083"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.460641 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f" exitCode=0 Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.460692 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.468663 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.485441 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.485472 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 
crc kubenswrapper[4856]: I0320 13:24:43.485479 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.485494 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.485502 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.506030 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.520000 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.532008 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.545086 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 
13:24:43.561082 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.574332 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.585876 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.588239 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.588320 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.588338 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.588361 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.588378 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.599347 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.613569 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.628758 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.642997 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.653543 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.663519 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.688053 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.691136 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.691154 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.691162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.691174 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.691183 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.700645 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.720750 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.737571 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.758103 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.770527 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.793569 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.793610 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.793621 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.793640 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.793651 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.796434 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.817021 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\
\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.833393 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.845750 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.856847 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.868333 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.891158 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.896073 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.896107 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.896119 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.896135 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.896146 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.907264 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.921133 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.931219 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.999373 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.999419 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.999435 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 13:24:43.999461 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:43 crc kubenswrapper[4856]: I0320 
13:24:43.999477 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:43Z","lastTransitionTime":"2026-03-20T13:24:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.103376 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.103455 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.103481 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.103548 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.103577 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.206698 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.206760 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.206782 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.206810 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.206832 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.309445 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.309483 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.309495 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.309511 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.309523 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.412874 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.412933 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.412949 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.413383 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.413425 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.470424 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.475672 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932" exitCode=0 Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.475736 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.496671 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.516448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.516493 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.516509 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc 
kubenswrapper[4856]: I0320 13:24:44.516531 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.516547 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.526234 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d91
73accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.547895 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.561977 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.576763 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.589579 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.602069 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.617179 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.619830 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.619891 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.619911 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.619943 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.620012 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.630186 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:
24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.642485 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.654394 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.676947 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.691691 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.706102 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.722580 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.722628 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.722652 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.722684 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.722709 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.728167 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:44Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.819716 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.819727 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.819918 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.819732 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.820032 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.820201 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.826145 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.826194 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.826208 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.826229 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.826243 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.880993 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.881084 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.881119 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.881192 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.881203 4856 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:25:00.881166131 +0000 UTC m=+115.762192291 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.881260 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:00.881238013 +0000 UTC m=+115.762264263 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.881264 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.881386 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:25:00.881366776 +0000 UTC m=+115.762392936 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.930100 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.930146 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.930156 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.930174 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.930185 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:44Z","lastTransitionTime":"2026-03-20T13:24:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.982111 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:44 crc kubenswrapper[4856]: I0320 13:24:44.982216 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982367 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982406 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982423 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982486 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:00.982463861 +0000 UTC m=+115.863490061 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982367 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982524 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982540 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:44 crc kubenswrapper[4856]: E0320 13:24:44.982594 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:00.982579724 +0000 UTC m=+115.863605934 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.033332 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.033400 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.033418 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.033458 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.033480 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.136779 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.136866 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.136890 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.136918 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.136939 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.240363 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.240431 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.240448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.240474 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.240491 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.343631 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.343712 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.343738 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.343768 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.343792 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.446745 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.446800 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.446816 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.446838 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.446855 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.483378 4856 generic.go:334] "Generic (PLEG): container finished" podID="24c895f6-954c-4211-8957-0d888d862cfa" containerID="465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471" exitCode=0 Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.483430 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerDied","Data":"465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.504338 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.525327 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.539352 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.553026 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.553065 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.553078 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.553098 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.553111 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.556841 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.595539 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.615823 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.631525 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.645501 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.656905 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.656941 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.656951 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.656966 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 
13:24:45.656977 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.657881 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.688078 4856 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.705988 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.723657 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.737609 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.753163 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.759460 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.759490 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.759501 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.759517 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.759527 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.765482 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.828783 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.845144 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.858280 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.861311 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.861334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.861342 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.861355 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.861363 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.870939 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.883579 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.894252 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.912058 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.925479 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.935843 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.947464 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.959453 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.962867 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.962913 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.962924 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.962943 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.962955 4856 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:45Z","lastTransitionTime":"2026-03-20T13:24:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.969365 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"
iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.979986 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:45 crc kubenswrapper[4856]: I0320 13:24:45.990695 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.003043 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.065554 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.065601 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.065613 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.065631 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.065644 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.167892 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.167923 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.167934 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.167948 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.167959 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.271103 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.271158 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.271178 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.271203 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.271221 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.375001 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.375060 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.375081 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.375109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.375131 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.478803 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.478863 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.478880 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.478906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.478922 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.495112 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" event={"ID":"24c895f6-954c-4211-8957-0d888d862cfa","Type":"ContainerStarted","Data":"22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.504498 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.504889 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.519231 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.551525 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.552255 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.574695 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.582603 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.582670 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.582694 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.582722 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.582746 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.596392 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.612655 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.639064 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.660365 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.676220 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.685768 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.685831 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.685845 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.685866 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.685926 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.691099 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.705120 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75
363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.719381 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.751260 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.770799 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.788897 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.788936 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.788953 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.789004 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.789021 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.792454 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.811598 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.819643 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:46 crc kubenswrapper[4856]: E0320 13:24:46.819813 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.820153 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.820199 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:46 crc kubenswrapper[4856]: E0320 13:24:46.820311 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:46 crc kubenswrapper[4856]: E0320 13:24:46.820429 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.827193 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.868066 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.885143 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.890956 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.890995 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.891007 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.891023 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.891033 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.904887 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.917289 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.936097 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.948765 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.959926 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.968507 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.979941 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.993573 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.993603 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 
20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.993614 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.993631 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.993643 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:46Z","lastTransitionTime":"2026-03-20T13:24:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:46 crc kubenswrapper[4856]: I0320 13:24:46.997584 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\"
:{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209
9482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\
\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.009883 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.024094 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.041608 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.060734 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.101089 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.101150 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.101162 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.101181 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.101194 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.204357 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.204390 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.204401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.204419 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.204431 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.306721 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.306764 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.306779 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.306799 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.306814 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.409469 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.409495 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.409503 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.409515 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.409524 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.509163 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.509210 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.511587 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.511632 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.511643 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.511659 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.511672 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.544967 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.568774 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.599588 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.614476 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.614518 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.614536 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.614561 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.614581 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.615588 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.636210 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.658151 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.676602 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.689634 4856 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.707154 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.716962 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.717010 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.717025 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.717045 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.717062 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.719317 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.733223 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.746294 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.758146 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.768831 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 
crc kubenswrapper[4856]: I0320 13:24:47.798672 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\
\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928
e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821226 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821674 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821690 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821697 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821708 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.821716 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.924324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.924506 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.924520 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.924537 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:47 crc kubenswrapper[4856]: I0320 13:24:47.924549 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:47Z","lastTransitionTime":"2026-03-20T13:24:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.026750 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.026781 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.026790 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.026802 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.026810 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.129327 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.129366 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.129375 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.129390 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.129400 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.232320 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.232391 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.232401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.232416 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.232427 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.334145 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.334184 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.334194 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.334207 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.334216 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.436544 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.436582 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.436591 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.436604 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.436615 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.539438 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.539474 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.539485 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.539500 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.539513 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.642651 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.642710 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.642719 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.642733 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.642744 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.744810 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.744847 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.744857 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.744869 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.744878 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.819633 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.819694 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:48 crc kubenswrapper[4856]: E0320 13:24:48.819748 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.819765 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:48 crc kubenswrapper[4856]: E0320 13:24:48.819837 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:48 crc kubenswrapper[4856]: E0320 13:24:48.819902 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.847076 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.847109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.847116 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.847129 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.847138 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.949211 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.949241 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.949252 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.949289 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:48 crc kubenswrapper[4856]: I0320 13:24:48.949304 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:48Z","lastTransitionTime":"2026-03-20T13:24:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.051754 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.051794 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.051805 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.051821 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.051835 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.155529 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.155581 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.155593 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.155611 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.155623 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.266430 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.266508 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.266531 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.266563 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.266585 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.370527 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.370852 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.370986 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.371325 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.371572 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.475046 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.475087 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.475098 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.475114 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.475125 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.515507 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/0.log" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.517874 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968" exitCode=1 Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.517911 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.518716 4856 scope.go:117] "RemoveContainer" containerID="8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.534519 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.545618 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.556106 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 
crc kubenswrapper[4856]: I0320 13:24:49.578016 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.578054 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.578068 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.578083 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.578093 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.583890 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.596099 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.611722 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.632553 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13
:24:49Z\\\",\\\"message\\\":\\\" 9\\\\nI0320 13:24:49.083940 6684 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:24:49.083973 6684 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:24:49.084075 6684 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:24:49.084100 6684 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:24:49.084135 6684 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:24:49.084147 6684 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:24:49.084183 6684 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:24:49.084210 6684 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:24:49.084216 6684 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:24:49.084225 6684 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:24:49.084236 6684 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:24:49.085554 6684 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:24:49.085593 6684 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:24:49.085631 6684 factory.go:656] Stopping watch factory\\\\nI0320 13:24:49.085649 6684 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:24:49.085646 6684 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.646605 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.662675 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.677947 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.680543 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.680563 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.680571 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.680583 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.680595 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.695716 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.705389 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.719618 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.731977 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.744075 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.782463 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.782502 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.782510 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.782524 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.782533 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.885230 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.885284 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.885296 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.885311 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.885320 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.987021 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.987060 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.987069 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.987084 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:49 crc kubenswrapper[4856]: I0320 13:24:49.987094 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:49Z","lastTransitionTime":"2026-03-20T13:24:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.088977 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.089036 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.089049 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.089073 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.089123 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.191498 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.191538 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.191550 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.191565 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.191575 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.293601 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.293643 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.293651 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.293666 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.293674 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.395951 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.395998 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.396009 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.396024 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.396035 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.499210 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.499311 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.499330 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.499356 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.499375 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.530325 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/1.log" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.531167 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/0.log" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.533891 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3" exitCode=1 Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.533943 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.533987 4856 scope.go:117] "RemoveContainer" containerID="8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.535069 4856 scope.go:117] "RemoveContainer" containerID="b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3" Mar 20 13:24:50 crc kubenswrapper[4856]: E0320 13:24:50.535326 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.549461 4856 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.564437 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.578374 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.589996 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.602123 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.602166 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.602177 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.602193 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.602204 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.604754 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.625785 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.658687 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce697
8abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.674861 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 
dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f2
9fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.694199 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.705380 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc 
kubenswrapper[4856]: I0320 13:24:50.705453 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.705477 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.705506 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.705527 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.712263 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.734887 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8367443afa0146ba39bdddef219ca95eacee546c8668d7e71d7ca475175a9968\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:49Z\\\",\\\"message\\\":\\\" 9\\\\nI0320 13:24:49.083940 6684 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:24:49.083973 6684 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:24:49.084075 6684 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:24:49.084100 6684 handler.go:190] Sending *v1.Node 
event handler 7 for removal\\\\nI0320 13:24:49.084135 6684 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:24:49.084147 6684 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:24:49.084183 6684 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:24:49.084210 6684 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:24:49.084216 6684 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:24:49.084225 6684 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:24:49.084236 6684 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:24:49.085554 6684 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:24:49.085593 6684 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:24:49.085631 6684 factory.go:656] Stopping watch factory\\\\nI0320 13:24:49.085649 6684 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:24:49.085646 6684 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 
},NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},
{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.757782 4856 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.777035 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.792155 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 
2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.805772 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.807661 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.807708 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.807720 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.807740 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.807753 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.819188 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.819363 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:50 crc kubenswrapper[4856]: E0320 13:24:50.819482 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.819571 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:50 crc kubenswrapper[4856]: E0320 13:24:50.819675 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:50 crc kubenswrapper[4856]: E0320 13:24:50.819763 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.910321 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.910360 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.910368 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.910403 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:50 crc kubenswrapper[4856]: I0320 13:24:50.910412 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:50Z","lastTransitionTime":"2026-03-20T13:24:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.013102 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.013160 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.013173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.013190 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.013201 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.115499 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.115541 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.115556 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.115579 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.115596 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.217872 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.217970 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.217983 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.218008 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.218028 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.320755 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.320816 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.320840 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.320863 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.320881 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.423981 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.424064 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.424089 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.424122 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.424147 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.526341 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.526403 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.526421 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.526446 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.526463 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.545990 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/1.log" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.550182 4856 scope.go:117] "RemoveContainer" containerID="b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.550394 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.571140 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.585673 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.600944 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.613929 4856 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.628831 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75
363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.630678 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.630719 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.630732 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc 
kubenswrapper[4856]: I0320 13:24:51.630753 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.630767 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.642587 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.658543 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.672110 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.684361 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.688448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.688481 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.688495 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.688515 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.688528 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.696393 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.705794 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.715052 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.715096 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.715111 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.715136 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.715150 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.720746 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.730435 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.735355 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.735401 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.735413 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.735428 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.735440 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.740855 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.754493 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.758757 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf
87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.759606 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.759643 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.759654 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.759669 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.759680 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.773756 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.773755 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.777502 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.777541 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.777551 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.777569 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.777579 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.792155 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: E0320 13:24:51.792430 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.794187 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.794245 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.794315 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.794360 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.794385 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.799803 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.897997 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.898050 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.898062 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.898081 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:51 crc kubenswrapper[4856]: I0320 13:24:51.898093 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:51Z","lastTransitionTime":"2026-03-20T13:24:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.001554 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.001645 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.001685 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.001729 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.001758 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.048248 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88"] Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.049174 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.052432 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.053427 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.073412 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.091389 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.104805 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.104883 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.104902 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.104931 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.104954 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.111308 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.131628 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.149671 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.151783 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.151906 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.151960 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt2m5\" (UniqueName: \"kubernetes.io/projected/2b79d345-85f7-40ac-beef-d6d9083fc195-kube-api-access-tt2m5\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.152037 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b79d345-85f7-40ac-beef-d6d9083fc195-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.167665 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.177626 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.189921 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.201909 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.206922 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.206971 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.206989 4856 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.207009 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.207025 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.223691 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94
fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.239999 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.252909 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.253158 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/2b79d345-85f7-40ac-beef-d6d9083fc195-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.253238 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.253262 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.253316 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt2m5\" (UniqueName: \"kubernetes.io/projected/2b79d345-85f7-40ac-beef-d6d9083fc195-kube-api-access-tt2m5\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.254060 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-env-overrides\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: 
I0320 13:24:52.254127 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2b79d345-85f7-40ac-beef-d6d9083fc195-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.261910 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2b79d345-85f7-40ac-beef-d6d9083fc195-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.266675 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.271923 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt2m5\" (UniqueName: \"kubernetes.io/projected/2b79d345-85f7-40ac-beef-d6d9083fc195-kube-api-access-tt2m5\") pod \"ovnkube-control-plane-749d76644c-n6c88\" (UID: \"2b79d345-85f7-40ac-beef-d6d9083fc195\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.282209 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.298568 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.307925 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.308914 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.308944 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.308952 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.308968 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.308977 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.371713 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.417310 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.417387 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.417413 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.417444 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.417467 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.521516 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.521581 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.521604 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.522111 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.522222 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.552782 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" event={"ID":"2b79d345-85f7-40ac-beef-d6d9083fc195","Type":"ContainerStarted","Data":"3437f5712d0bb1f1d21164d8b277448e1525eab01166099394ffd54855636916"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.624719 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.624757 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.624766 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.624781 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.624794 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.727208 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.727254 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.727324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.727345 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.727356 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.819367 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.819384 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:52 crc kubenswrapper[4856]: E0320 13:24:52.819512 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.819533 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:52 crc kubenswrapper[4856]: E0320 13:24:52.819608 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:52 crc kubenswrapper[4856]: E0320 13:24:52.819661 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.829989 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.830016 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.830026 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.830037 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.830045 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.932068 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.932114 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.932129 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.932150 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:52 crc kubenswrapper[4856]: I0320 13:24:52.932163 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:52Z","lastTransitionTime":"2026-03-20T13:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.035004 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.035041 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.035049 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.035063 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.035073 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.137678 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.137730 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.137746 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.137768 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.137785 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.241470 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.241533 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.241549 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.241575 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.241592 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.345131 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.345215 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.345243 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.345306 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.345331 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.448215 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.448299 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.448317 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.448341 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.448358 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.521892 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qtlvp"] Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.522766 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:53 crc kubenswrapper[4856]: E0320 13:24:53.522910 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.542686 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.551650 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.551700 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.551718 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.551742 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.551761 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.558315 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" event={"ID":"2b79d345-85f7-40ac-beef-d6d9083fc195","Type":"ContainerStarted","Data":"3fcae669bbee703b86ecfcbc465ad91760cd99a27a12ebc700895747630981ca"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.558379 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" event={"ID":"2b79d345-85f7-40ac-beef-d6d9083fc195","Type":"ContainerStarted","Data":"8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.564372 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.567609 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46gr4\" (UniqueName: \"kubernetes.io/projected/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-kube-api-access-46gr4\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" 
Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.567668 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.577452 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node
-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.593698 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75
363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.609435 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.641938 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.655052 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.655133 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.655161 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.655193 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.655216 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.665152 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.668767 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46gr4\" (UniqueName: \"kubernetes.io/projected/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-kube-api-access-46gr4\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.668819 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:53 crc kubenswrapper[4856]: E0320 13:24:53.668960 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:53 crc kubenswrapper[4856]: E0320 13:24:53.669031 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.169006757 +0000 UTC m=+109.050032907 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.685616 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.698617 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46gr4\" (UniqueName: \"kubernetes.io/projected/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-kube-api-access-46gr4\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.701696 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.721699 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.744252 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758228 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758330 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758356 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758386 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758348 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.758410 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.778978 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e
1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.795316 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.809022 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.828595 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.840181 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc 
kubenswrapper[4856]: I0320 13:24:53.855945 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.860961 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.861000 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.861011 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.861026 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.861037 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.871605 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.885445 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.905533 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.920574 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc 
kubenswrapper[4856]: I0320 13:24:53.939379 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.953059 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.963358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.963423 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.963440 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.963464 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.963482 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:53Z","lastTransitionTime":"2026-03-20T13:24:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.972166 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:53 crc kubenswrapper[4856]: I0320 13:24:53.987308 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75
363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.001723 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:53Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.035962 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.066182 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.066229 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.066245 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.066293 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.066310 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.067509 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.084578 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"
phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.095591 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.108901 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.129043 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.140728 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.169971 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.170324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.170519 4856 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.170661 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.170793 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.174559 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.174696 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.174784 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:24:55.174763869 +0000 UTC m=+110.055790009 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.274760 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.274825 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.274847 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.274873 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.274891 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.377477 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.377971 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.378213 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.378399 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.378574 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.482157 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.482551 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.482710 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.482898 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.483117 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.585578 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.586309 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.586432 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.586561 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.586691 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.689591 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.689670 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.689693 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.689722 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.689745 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.793514 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.793564 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.793579 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.793595 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.793606 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.819341 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.819390 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.819527 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.819646 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.819740 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.819930 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.820080 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:54 crc kubenswrapper[4856]: E0320 13:24:54.820298 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.820446 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.895521 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.895588 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.895601 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.895618 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:54 crc kubenswrapper[4856]: I0320 13:24:54.895630 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:54Z","lastTransitionTime":"2026-03-20T13:24:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.001974 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.002034 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.002047 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.002070 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.002086 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.106009 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.106062 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.106075 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.106093 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.106105 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.184335 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:55 crc kubenswrapper[4856]: E0320 13:24:55.184495 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:55 crc kubenswrapper[4856]: E0320 13:24:55.184558 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:24:57.184541648 +0000 UTC m=+112.065567788 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.208664 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.208749 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.208772 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.208796 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.208814 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.310775 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.310814 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.310825 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.310839 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.310849 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.412583 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.412619 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.412627 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.412641 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.412667 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.515537 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.515578 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.515586 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.515600 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.515609 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.568015 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.570949 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.571520 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.592004 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed
0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.611209 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.619053 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 
13:24:55.619082 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.619094 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.619109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.619121 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.625751 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.644101 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.656218 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.682962 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.699546 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.711352 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.720315 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.721836 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.721857 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.721866 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 
13:24:55.721878 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.721888 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.735852 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.768380 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.784379 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.799665 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.814863 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.823556 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.823593 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.823603 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.823617 4856 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.823626 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.830440 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.844475 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.853850 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc 
kubenswrapper[4856]: I0320 13:24:55.864989 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.887816 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.905538 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.923137 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.925820 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.925855 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.925868 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.925884 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.925894 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:55Z","lastTransitionTime":"2026-03-20T13:24:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.934170 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.950429 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.968225 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.979534 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:55 crc kubenswrapper[4856]: I0320 13:24:55.991236 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.004875 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.019979 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.028203 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.028254 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.028280 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.028296 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.028308 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.037117 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.051733 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc 
kubenswrapper[4856]: I0320 13:24:56.064385 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.076229 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.085935 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.098073 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:24:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.129838 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.129871 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.129878 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.129891 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.129902 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.233117 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.233171 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.233187 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.233210 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.233225 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.336110 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.336168 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.336184 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.336231 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.336248 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.439888 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.439958 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.439984 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.440019 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.440044 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.542763 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.542835 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.542858 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.542889 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.542912 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.645499 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.645553 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.645576 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.645624 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.645650 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.747930 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.747970 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.747984 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.748000 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.748011 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.819084 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.819255 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:56 crc kubenswrapper[4856]: E0320 13:24:56.819347 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.819381 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.819529 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:56 crc kubenswrapper[4856]: E0320 13:24:56.819535 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:24:56 crc kubenswrapper[4856]: E0320 13:24:56.819595 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:56 crc kubenswrapper[4856]: E0320 13:24:56.819717 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.850173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.850224 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.850234 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.850247 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.850257 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.952705 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.952734 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.952743 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.952771 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:56 crc kubenswrapper[4856]: I0320 13:24:56.952780 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:56Z","lastTransitionTime":"2026-03-20T13:24:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.054845 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.054906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.054923 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.054946 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.054968 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.158318 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.158379 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.158391 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.158409 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.158421 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.207919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:57 crc kubenswrapper[4856]: E0320 13:24:57.208327 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:57 crc kubenswrapper[4856]: E0320 13:24:57.208726 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:25:01.208694588 +0000 UTC m=+116.089720758 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.261150 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.261188 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.261209 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.261224 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.261233 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.363814 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.363873 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.363896 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.363918 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.363933 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.466903 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.466968 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.466989 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.467016 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.467034 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.570784 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.570851 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.570873 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.570902 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.570924 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.673427 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.673503 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.673521 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.673546 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.673565 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.776550 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.776631 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.776654 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.776699 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.776722 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.879517 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.879590 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.879745 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.879780 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.879819 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.982849 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.982907 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.982932 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.982954 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:57 crc kubenswrapper[4856]: I0320 13:24:57.982972 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:57Z","lastTransitionTime":"2026-03-20T13:24:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.086313 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.086382 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.086406 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.086436 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.086460 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.191102 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.191178 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.191201 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.191233 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.191258 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.294928 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.294992 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.295015 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.295040 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.295058 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.398326 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.398527 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.398562 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.398612 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.398635 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.501139 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.501206 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.501224 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.501252 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.501376 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.604373 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.604419 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.604436 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.604460 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.604477 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.707611 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.707676 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.707697 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.707729 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.707752 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.811061 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.811110 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.811121 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.811138 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.811153 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.819605 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.819636 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.819634 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:58 crc kubenswrapper[4856]: E0320 13:24:58.819739 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.819818 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:24:58 crc kubenswrapper[4856]: E0320 13:24:58.819918 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:58 crc kubenswrapper[4856]: E0320 13:24:58.820076 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:24:58 crc kubenswrapper[4856]: E0320 13:24:58.820205 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.913585 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.913666 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.913683 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.913708 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:58 crc kubenswrapper[4856]: I0320 13:24:58.913726 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:58Z","lastTransitionTime":"2026-03-20T13:24:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.016068 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.016128 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.016137 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.016150 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.016159 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.118864 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.118902 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.118912 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.118927 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.118938 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.221322 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.221350 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.221358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.221371 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.221378 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.324916 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.324982 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.325006 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.325036 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.325057 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.428503 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.428570 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.428587 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.428612 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.428632 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.531392 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.531431 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.531443 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.531458 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.531468 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.633354 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.633457 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.633479 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.633503 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.633523 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.736420 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.736467 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.736481 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.736501 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.736516 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.840253 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.840315 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.840326 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.840345 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.840357 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.944226 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.944290 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.944309 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.944333 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:24:59 crc kubenswrapper[4856]: I0320 13:24:59.944348 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:24:59Z","lastTransitionTime":"2026-03-20T13:24:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.047858 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.048340 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.048367 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.048397 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.048421 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.151380 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.151437 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.151459 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.151483 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.151499 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.254470 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.254532 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.254553 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.254581 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.254601 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.358058 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.358118 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.358134 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.358165 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.358183 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.460830 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.461096 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.461167 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.461302 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.461383 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.563811 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.563856 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.563870 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.563890 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.563902 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.666759 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.666800 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.666810 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.666826 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.666841 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.768759 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.768798 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.768808 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.768822 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.768831 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.818774 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.818962 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.819025 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.819178 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.819192 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.819360 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.819383 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.819597 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.871444 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.871473 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.871482 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.871494 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.871503 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.945579 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.945741 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:25:32.94571244 +0000 UTC m=+147.826738570 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.945840 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.945874 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.945971 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.946009 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:25:32.946002258 +0000 UTC m=+147.827028488 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.946106 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:00 crc kubenswrapper[4856]: E0320 13:25:00.946240 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:32.946205903 +0000 UTC m=+147.827232073 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.973811 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.973867 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.973884 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.973906 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:00 crc kubenswrapper[4856]: I0320 13:25:00.973923 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:00Z","lastTransitionTime":"2026-03-20T13:25:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.047015 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.047143 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047302 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047348 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047369 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047372 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047406 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047418 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047448 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:33.047426462 +0000 UTC m=+147.928452632 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.047477 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:25:33.047464563 +0000 UTC m=+147.928490733 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.076876 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.076939 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.076956 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.076981 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.076998 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.180175 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.180258 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.180346 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.180371 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.180390 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.249826 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.250223 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:01 crc kubenswrapper[4856]: E0320 13:25:01.250475 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:25:09.250432654 +0000 UTC m=+124.131458784 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.282856 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.282918 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.282935 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.282958 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.282974 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.385635 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.385679 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.385691 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.385708 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.385719 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.488058 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.488135 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.488159 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.488185 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.488200 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.591361 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.591427 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.591444 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.591467 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.591484 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.693852 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.693904 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.693921 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.693944 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.693961 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.797578 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.797620 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.797628 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.797643 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.797660 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.900167 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.900210 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.900221 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.900236 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:01 crc kubenswrapper[4856]: I0320 13:25:01.900246 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:01Z","lastTransitionTime":"2026-03-20T13:25:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.003833 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.003903 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.003922 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.003949 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.003966 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.106703 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.106737 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.106746 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.106759 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.106768 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.133772 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.133813 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.133824 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.133840 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.133853 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.154156 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:02Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.158173 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.158220 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.158232 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.158249 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.158261 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.177332 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:02Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.181319 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.181400 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.181423 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.181456 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.181479 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.202808 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:02Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.207738 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.207793 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.207809 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.207828 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.207841 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.223897 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:02Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.228227 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.228324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.228351 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.228381 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.228406 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.246072 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:02Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.246339 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.248470 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.248527 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.248568 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.248620 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.248643 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.353061 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.353158 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.353183 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.353216 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.353253 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.456936 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.456969 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.456978 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.456994 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.457003 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.559533 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.559625 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.559648 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.559671 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.559692 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.662751 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.662811 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.662829 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.662851 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.662868 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.765441 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.765500 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.765522 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.765545 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.765562 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.819822 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.819837 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.819924 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.819982 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.820401 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.820547 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.820693 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:02 crc kubenswrapper[4856]: E0320 13:25:02.820868 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.833833 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.868825 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.868898 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.868921 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.868948 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.868965 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.972072 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.972154 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.972183 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.972207 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:02 crc kubenswrapper[4856]: I0320 13:25:02.972224 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:02Z","lastTransitionTime":"2026-03-20T13:25:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.075463 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.075528 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.075551 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.075579 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.075600 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.178298 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.178354 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.178370 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.178393 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.178410 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.280913 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.281020 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.281047 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.281079 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.281102 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.384003 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.384041 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.384050 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.384064 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.384073 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.487325 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.487428 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.487451 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.487478 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.487501 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.589656 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.589685 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.589693 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.589705 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.589713 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.693300 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.693334 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.693346 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.693361 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.693372 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.796600 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.796662 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.796672 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.796686 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.796694 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.819767 4856 scope.go:117] "RemoveContainer" containerID="b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.899846 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.899888 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.899900 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.899916 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:03 crc kubenswrapper[4856]: I0320 13:25:03.899929 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:03Z","lastTransitionTime":"2026-03-20T13:25:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.002235 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.002318 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.002336 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.002358 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.002376 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.104776 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.104814 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.104823 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.104839 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.104849 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.206848 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.206896 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.206910 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.206928 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.206943 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.310234 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.310294 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.310318 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.310339 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.310353 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.413394 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.413436 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.413447 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.413463 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.413473 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.516300 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.516333 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.516344 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.516363 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.516374 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.603130 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/1.log" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.606348 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.607196 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618121 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618148 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618156 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618168 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618177 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.618973 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.630502 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.640851 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.654310 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.668079 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.680707 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.695842 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.705812 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.717152 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.720800 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.720832 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.720840 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.720853 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.720862 4856 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.737606 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff9
2f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.747618 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.758215 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.776543 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for 
ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.788230 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.800486 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.813326 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.818943 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.818965 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.819010 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.819027 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:04 crc kubenswrapper[4856]: E0320 13:25:04.819066 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:04 crc kubenswrapper[4856]: E0320 13:25:04.819135 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:04 crc kubenswrapper[4856]: E0320 13:25:04.819218 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:04 crc kubenswrapper[4856]: E0320 13:25:04.819341 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824142 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824176 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824187 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824203 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824214 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.824109 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc 
kubenswrapper[4856]: I0320 13:25:04.837590 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:04Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.926961 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.927220 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.927374 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.927498 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:04 crc kubenswrapper[4856]: I0320 13:25:04.927741 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:04Z","lastTransitionTime":"2026-03-20T13:25:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.030077 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.030109 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.030118 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.030131 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.030140 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.133610 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.133645 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.133654 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.133671 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.133686 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.236787 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.237167 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.237348 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.237552 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.237723 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.340538 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.340905 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.341149 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.341393 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.341572 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.444565 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.444617 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.444635 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.444658 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.444675 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.547540 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.547585 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.547602 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.547625 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.547641 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.613244 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/2.log" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.614655 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/1.log" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.619737 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" exitCode=1 Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.619800 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.619872 4856 scope.go:117] "RemoveContainer" containerID="b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.620960 4856 scope.go:117] "RemoveContainer" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" Mar 20 13:25:05 crc kubenswrapper[4856]: E0320 13:25:05.621212 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.639359 4856 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.652831 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.652887 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.652908 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.652937 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.652958 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:05Z","lastTransitionTime":"2026-03-20T13:25:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.670824 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.694578 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.714746 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.737503 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: E0320 13:25:05.753747 4856 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.761871 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.793778 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 
9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 
13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qz
mgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.817890 4856 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd99a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.838571 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.854199 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.869221 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.893406 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.910022 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc 
kubenswrapper[4856]: I0320 13:25:05.933174 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: E0320 13:25:05.945999 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.952044 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.966403 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.979199 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:05 crc kubenswrapper[4856]: I0320 13:25:05.995931 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:05Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.011669 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.030573 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b34256d646e756dc34337d720daabb32670e503e4b5db0e5f9afc1f5ffd821c3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:24:50Z\\\",\\\"message\\\":\\\"le-node-developer:true 
service.alpha.openshift.io/serving-cert-secret-name:marketplace-operator-metrics service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{config.openshift.io/v1 ClusterVersion version 9101b518-476b-4eea-8fa6-69b0534e5caa 0xc0071ec517 \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:metrics,Protocol:TCP,Port:8383,TargetPort:{0 8383 },NodePort:0,AppProtocol:nil,},ServicePort{Name:https-metrics,Protocol:TCP,Port:8081,TargetPort:{0 8081 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{name: marketplace-operator,},ClusterIP:10.217.5.53,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.5.53],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0320 13:24:50.294127 6849 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for ne\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 
metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f136
98952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.045653 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.060346 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.077723 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.096855 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.118369 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.134799 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc 
kubenswrapper[4856]: I0320 13:25:06.155912 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.172128 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.184633 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.198705 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.214725 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.227973 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.251647 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.270659 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.290843 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.302214 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.626601 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/2.log" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.630297 4856 scope.go:117] "RemoveContainer" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" Mar 20 13:25:06 crc kubenswrapper[4856]: E0320 13:25:06.630484 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.643692 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.657772 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.674153 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.691334 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.705639 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.726069 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for 
RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.742749 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.758631 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.769818 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.801776 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.815357 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd99a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.818974 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.819101 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.819019 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.819020 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:06 crc kubenswrapper[4856]: E0320 13:25:06.819218 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:06 crc kubenswrapper[4856]: E0320 13:25:06.819399 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:06 crc kubenswrapper[4856]: E0320 13:25:06.819510 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:06 crc kubenswrapper[4856]: E0320 13:25:06.819563 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.827896 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.849794 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.863952 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.875633 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.889811 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc kubenswrapper[4856]: I0320 13:25:06.905192 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:06 crc 
kubenswrapper[4856]: I0320 13:25:06.921627 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:06Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:08 crc kubenswrapper[4856]: I0320 13:25:08.819288 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:08 crc kubenswrapper[4856]: I0320 13:25:08.819324 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:08 crc kubenswrapper[4856]: I0320 13:25:08.819396 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:08 crc kubenswrapper[4856]: E0320 13:25:08.819418 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:08 crc kubenswrapper[4856]: I0320 13:25:08.819482 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:08 crc kubenswrapper[4856]: E0320 13:25:08.819590 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:08 crc kubenswrapper[4856]: E0320 13:25:08.819653 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:08 crc kubenswrapper[4856]: E0320 13:25:08.819741 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:09 crc kubenswrapper[4856]: I0320 13:25:09.331063 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:09 crc kubenswrapper[4856]: E0320 13:25:09.331239 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:09 crc kubenswrapper[4856]: E0320 13:25:09.331380 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. 
No retries permitted until 2026-03-20 13:25:25.331355314 +0000 UTC m=+140.212381484 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:10 crc kubenswrapper[4856]: I0320 13:25:10.819510 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:10 crc kubenswrapper[4856]: I0320 13:25:10.819565 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:10 crc kubenswrapper[4856]: I0320 13:25:10.819539 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:10 crc kubenswrapper[4856]: I0320 13:25:10.819527 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:10 crc kubenswrapper[4856]: E0320 13:25:10.819709 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:10 crc kubenswrapper[4856]: E0320 13:25:10.819799 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:10 crc kubenswrapper[4856]: E0320 13:25:10.819876 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:10 crc kubenswrapper[4856]: E0320 13:25:10.820088 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:10 crc kubenswrapper[4856]: E0320 13:25:10.947550 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.337767 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.337834 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.337852 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.337875 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.337895 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:12Z","lastTransitionTime":"2026-03-20T13:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.359845 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.364655 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.364748 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.364766 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.364792 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.364812 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:12Z","lastTransitionTime":"2026-03-20T13:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.385051 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.396524 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.396615 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.396640 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.396671 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.396691 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:12Z","lastTransitionTime":"2026-03-20T13:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.419319 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.425734 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.425772 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.425784 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.425801 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.425815 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:12Z","lastTransitionTime":"2026-03-20T13:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.442999 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.447865 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.447911 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.447930 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.447952 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.447969 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:12Z","lastTransitionTime":"2026-03-20T13:25:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.462369 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.462790 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.819674 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.819724 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.819707 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:12 crc kubenswrapper[4856]: I0320 13:25:12.819802 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.819868 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.820041 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.820164 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:12 crc kubenswrapper[4856]: E0320 13:25:12.820403 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.829939 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.898882 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.919416 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.935322 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.952304 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.963974 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.975935 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:13 crc kubenswrapper[4856]: I0320 13:25:13.986500 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.001837 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.014035 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.030746 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.044241 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.055518 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.075933 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.087695 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.098152 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.108689 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.119766 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.131085 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.147231 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.160518 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:14 crc 
kubenswrapper[4856]: I0320 13:25:14.819240 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.819361 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.819429 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:14 crc kubenswrapper[4856]: I0320 13:25:14.819464 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:14 crc kubenswrapper[4856]: E0320 13:25:14.819633 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:14 crc kubenswrapper[4856]: E0320 13:25:14.819828 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:14 crc kubenswrapper[4856]: E0320 13:25:14.819958 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:14 crc kubenswrapper[4856]: E0320 13:25:14.820145 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.832164 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.850325 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.865992 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.879056 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.895328 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.906681 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:15 crc kubenswrapper[4856]: E0320 13:25:15.948497 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:15 crc kubenswrapper[4856]: I0320 13:25:15.982449 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.003808 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\"
,\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 
requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe1
93c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.015795 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.024714 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.034770 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.053222 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.063800 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.073075 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc 
kubenswrapper[4856]: I0320 13:25:16.083304 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.095639 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.108233 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.127933 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.144503 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.819332 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.819383 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.819449 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:16 crc kubenswrapper[4856]: E0320 13:25:16.819694 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:16 crc kubenswrapper[4856]: E0320 13:25:16.819844 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:16 crc kubenswrapper[4856]: E0320 13:25:16.819947 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:16 crc kubenswrapper[4856]: I0320 13:25:16.820067 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:16 crc kubenswrapper[4856]: E0320 13:25:16.820146 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:18 crc kubenswrapper[4856]: I0320 13:25:18.819849 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:18 crc kubenswrapper[4856]: I0320 13:25:18.819857 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:18 crc kubenswrapper[4856]: E0320 13:25:18.820046 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:18 crc kubenswrapper[4856]: I0320 13:25:18.819884 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:18 crc kubenswrapper[4856]: I0320 13:25:18.819879 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:18 crc kubenswrapper[4856]: E0320 13:25:18.820130 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:18 crc kubenswrapper[4856]: E0320 13:25:18.820144 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:18 crc kubenswrapper[4856]: E0320 13:25:18.820300 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:19 crc kubenswrapper[4856]: I0320 13:25:19.820871 4856 scope.go:117] "RemoveContainer" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" Mar 20 13:25:19 crc kubenswrapper[4856]: E0320 13:25:19.821205 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:20 crc kubenswrapper[4856]: I0320 13:25:20.820106 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:20 crc kubenswrapper[4856]: E0320 13:25:20.820424 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:20 crc kubenswrapper[4856]: I0320 13:25:20.820549 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:20 crc kubenswrapper[4856]: E0320 13:25:20.820799 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:20 crc kubenswrapper[4856]: I0320 13:25:20.820979 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:20 crc kubenswrapper[4856]: E0320 13:25:20.821097 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:20 crc kubenswrapper[4856]: I0320 13:25:20.821161 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:20 crc kubenswrapper[4856]: E0320 13:25:20.821307 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:20 crc kubenswrapper[4856]: E0320 13:25:20.949432 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.641720 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.641765 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.641776 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.641791 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.641802 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:22Z","lastTransitionTime":"2026-03-20T13:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.655330 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:22Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.658892 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.658936 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.658949 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.658965 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.658976 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:22Z","lastTransitionTime":"2026-03-20T13:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.670806 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:22Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.675152 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.675190 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.675203 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.675217 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.675225 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:22Z","lastTransitionTime":"2026-03-20T13:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.688533 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:22Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.693153 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.693198 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.693211 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.693228 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.693240 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:22Z","lastTransitionTime":"2026-03-20T13:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.705904 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:22Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.709518 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.709561 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.709577 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.709591 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.709604 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:22Z","lastTransitionTime":"2026-03-20T13:25:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.723079 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:22Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.723238 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.819353 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.819470 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.819473 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:22 crc kubenswrapper[4856]: I0320 13:25:22.819474 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.819608 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.819719 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.819860 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:22 crc kubenswrapper[4856]: E0320 13:25:22.819962 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:24 crc kubenswrapper[4856]: I0320 13:25:24.819463 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:24 crc kubenswrapper[4856]: I0320 13:25:24.819517 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:24 crc kubenswrapper[4856]: I0320 13:25:24.819537 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:24 crc kubenswrapper[4856]: I0320 13:25:24.819482 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:24 crc kubenswrapper[4856]: E0320 13:25:24.819716 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:24 crc kubenswrapper[4856]: E0320 13:25:24.819978 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:24 crc kubenswrapper[4856]: E0320 13:25:24.820215 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:24 crc kubenswrapper[4856]: E0320 13:25:24.820357 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.413929 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:25 crc kubenswrapper[4856]: E0320 13:25:25.414059 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:25 crc kubenswrapper[4856]: E0320 13:25:25.414130 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:25:57.414114615 +0000 UTC m=+172.295140755 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.839090 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc 
kubenswrapper[4856]: I0320 13:25:25.862408 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.881898 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.899539 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.913373 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.925551 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.937115 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: E0320 13:25:25.950410 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.964179 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.979411 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\"
,\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 
requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe1
93c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:25 crc kubenswrapper[4856]: I0320 13:25:25.991962 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.002763 4856 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.023837 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.035170 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.050147 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.060854 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc 
kubenswrapper[4856]: I0320 13:25:26.072262 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.084703 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.098853 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.111208 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.696069 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/0.log" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.696123 4856 generic.go:334] "Generic (PLEG): container finished" podID="da4c21dd-2600-4141-bf05-7c18c1932a33" containerID="de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641" exitCode=1 Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.696154 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerDied","Data":"de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641"} Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.696950 4856 scope.go:117] "RemoveContainer" containerID="de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.714919 4856 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.727192 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.738838 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.751993 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.764068 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.777059 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.795558 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.809786 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.819772 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:26 crc kubenswrapper[4856]: E0320 13:25:26.819919 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.820042 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:26 crc kubenswrapper[4856]: E0320 13:25:26.820209 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.820052 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:26 crc kubenswrapper[4856]: E0320 13:25:26.820661 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.820787 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:26 crc kubenswrapper[4856]: E0320 13:25:26.820892 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.828137 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.838256 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.848766 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.873765 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 
obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.886321 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.899018 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.913117 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.927390 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.940876 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.955437 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:26 crc kubenswrapper[4856]: I0320 13:25:26.965244 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:26Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc 
kubenswrapper[4856]: I0320 13:25:27.703140 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/0.log" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.703220 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerStarted","Data":"2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608"} Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.723498 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"last
State\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.741387 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.756580 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.768909 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.784106 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.799951 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-ku
belet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.824411 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.845223 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.859727 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.871171 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af9
4b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.889146 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.911199 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.926906 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.940888 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc 
kubenswrapper[4856]: I0320 13:25:27.958600 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.978368 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:27 crc kubenswrapper[4856]: I0320 13:25:27.999666 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:27Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.019394 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:28Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.045006 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:28Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.819506 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.819554 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.819619 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:28 crc kubenswrapper[4856]: I0320 13:25:28.819666 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:28 crc kubenswrapper[4856]: E0320 13:25:28.819888 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:28 crc kubenswrapper[4856]: E0320 13:25:28.820037 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:28 crc kubenswrapper[4856]: E0320 13:25:28.820140 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:28 crc kubenswrapper[4856]: E0320 13:25:28.820321 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:30 crc kubenswrapper[4856]: I0320 13:25:30.819439 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:30 crc kubenswrapper[4856]: I0320 13:25:30.819596 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:30 crc kubenswrapper[4856]: I0320 13:25:30.819738 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:30 crc kubenswrapper[4856]: E0320 13:25:30.819741 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:30 crc kubenswrapper[4856]: I0320 13:25:30.819780 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:30 crc kubenswrapper[4856]: E0320 13:25:30.819892 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:30 crc kubenswrapper[4856]: E0320 13:25:30.820056 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:30 crc kubenswrapper[4856]: E0320 13:25:30.820198 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:30 crc kubenswrapper[4856]: E0320 13:25:30.951637 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.735689 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.735762 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.735782 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.735808 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.735826 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:32Z","lastTransitionTime":"2026-03-20T13:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.758325 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.764324 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.764389 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.764413 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.764441 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.764463 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:32Z","lastTransitionTime":"2026-03-20T13:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.785464 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.790645 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.790903 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.791086 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.791233 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.791417 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:32Z","lastTransitionTime":"2026-03-20T13:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.812155 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.817440 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.817551 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.817575 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.817602 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.817623 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:32Z","lastTransitionTime":"2026-03-20T13:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.819127 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.819182 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.819308 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.819133 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.819710 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.819869 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.819705 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.820490 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.820924 4856 scope.go:117] "RemoveContainer" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.840113 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.849645 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.849713 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.849738 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.849769 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:32 crc kubenswrapper[4856]: I0320 13:25:32.849793 4856 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:32Z","lastTransitionTime":"2026-03-20T13:25:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.874856 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:32 crc kubenswrapper[4856]: E0320 13:25:32.875101 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.008786 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.008912 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.008966 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.009181 4856 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.009254 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.009232784 +0000 UTC m=+211.890258954 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.009545 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.009527722 +0000 UTC m=+211.890553882 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.009658 4856 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.009899 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.009855831 +0000 UTC m=+211.890881991 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.110229 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.110591 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.110650 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.110673 4856 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.110726 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.110777 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.110740867 +0000 UTC m=+211.991767147 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.111069 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.111113 4856 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.111137 4856 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:33 crc kubenswrapper[4856]: E0320 13:25:33.111226 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.111195409 +0000 UTC m=+211.992221579 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.729624 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/2.log" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.735311 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.735918 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.757383 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.774512 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.796091 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.810947 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af9
4b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.823500 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.848248 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 
obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.862874 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.878429 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.891672 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.911347 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.926151 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.943024 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.956790 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc 
kubenswrapper[4856]: I0320 13:25:33.971035 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:33 crc kubenswrapper[4856]: I0320 13:25:33.989897 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.004449 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.014870 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.032863 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.052508 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.740525 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/3.log" Mar 20 13:25:34 crc 
kubenswrapper[4856]: I0320 13:25:34.741223 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/2.log" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.744842 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" exitCode=1 Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.744917 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.744980 4856 scope.go:117] "RemoveContainer" containerID="ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.745546 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:25:34 crc kubenswrapper[4856]: E0320 13:25:34.745721 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.767412 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.801891 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.819874 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.819904 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.819974 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.820108 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:34 crc kubenswrapper[4856]: E0320 13:25:34.820428 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:34 crc kubenswrapper[4856]: E0320 13:25:34.820515 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.820391 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: E0320 13:25:34.820577 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:34 crc kubenswrapper[4856]: E0320 13:25:34.820336 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.840916 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93
a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.858012 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af94b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.871720 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.899735 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ec4f61c065434f9a90bb8aca6d6c2ea1c3d2d8c26745a2f7c394aedb6096eda0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:04Z\\\",\\\"message\\\":\\\"e *v1.Pod\\\\nI0320 13:25:04.969675 7085 obj_retry.go:409] Going to retry *v1.Pod 
resource setup for 2 objects: [openshift-multus/network-metrics-daemon-qtlvp openshift-multus/multus-chwcj]\\\\nI0320 13:25:04.969678 7085 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 13:25:04.969687 7085 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 13:25:04.969704 7085 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969717 7085 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-chwcj\\\\nI0320 13:25:04.969728 7085 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-chwcj in node crc\\\\nI0320 13:25:04.969736 7085 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-chwcj after 0 failed attempt(s)\\\\nF0320 13:25:04.969736 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controlle\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:04Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:33Z\\\",\\\"message\\\":\\\"controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:25:33.806546 7410 services_controller.go:445] Built service 
openshift-operator-lifecycle-manager/packageserver-service LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 13:25:33.806553 7410 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:25:33.806559 7410 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for 
network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f1
3698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.915773 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.933648 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.956964 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.975819 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:34 crc kubenswrapper[4856]: I0320 13:25:34.996734 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.013624 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.024533 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc 
kubenswrapper[4856]: I0320 13:25:35.042790 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.058795 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.072329 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.082770 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.095366 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.756003 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/3.log" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.763475 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:25:35 crc kubenswrapper[4856]: E0320 13:25:35.763880 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.776882 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.791158 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.805361 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.821681 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.839280 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.854069 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.866856 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af9
4b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.878666 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.901811 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.920925 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.935436 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: E0320 13:25:35.952252 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.959186 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:33Z\\\",\\\"message\\\":\\\"controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:25:33.806546 7410 services_controller.go:445] Built service openshift-operator-lifecycle-manager/packageserver-service LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nF0320 13:25:33.806553 7410 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:25:33.806559 7410 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.974537 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:35 crc kubenswrapper[4856]: I0320 13:25:35.988979 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.010394 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.023409 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc 
kubenswrapper[4856]: I0320 13:25:36.038593 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.060499 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.076419 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.094884 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.115355 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.132801 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc 
kubenswrapper[4856]: I0320 13:25:36.145727 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.163983 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.182034 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.193960 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.208485 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.228035 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.244776 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.261642 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.278105 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.294194 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af9
4b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.307990 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.339083 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.354776 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.370233 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.395886 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:33Z\\\",\\\"message\\\":\\\"controller.go:445] Built service 
openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:25:33.806546 7410 services_controller.go:445] Built service openshift-operator-lifecycle-manager/packageserver-service LB template configs for network=default: []services.lbConfig(nil)\\\\nF0320 13:25:33.806553 7410 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:25:33.806559 7410 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.407804 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.819738 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.819797 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.819765 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:36 crc kubenswrapper[4856]: I0320 13:25:36.819738 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:36 crc kubenswrapper[4856]: E0320 13:25:36.819956 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:36 crc kubenswrapper[4856]: E0320 13:25:36.820042 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:36 crc kubenswrapper[4856]: E0320 13:25:36.820145 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:36 crc kubenswrapper[4856]: E0320 13:25:36.820228 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:38 crc kubenswrapper[4856]: I0320 13:25:38.819171 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:38 crc kubenswrapper[4856]: E0320 13:25:38.819342 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:38 crc kubenswrapper[4856]: I0320 13:25:38.819344 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:38 crc kubenswrapper[4856]: E0320 13:25:38.819437 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:38 crc kubenswrapper[4856]: I0320 13:25:38.819515 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:38 crc kubenswrapper[4856]: I0320 13:25:38.819541 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:38 crc kubenswrapper[4856]: E0320 13:25:38.819791 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:38 crc kubenswrapper[4856]: E0320 13:25:38.820071 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:40 crc kubenswrapper[4856]: I0320 13:25:40.819306 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:40 crc kubenswrapper[4856]: I0320 13:25:40.819333 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:40 crc kubenswrapper[4856]: I0320 13:25:40.819375 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:40 crc kubenswrapper[4856]: I0320 13:25:40.819452 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:40 crc kubenswrapper[4856]: E0320 13:25:40.819558 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:40 crc kubenswrapper[4856]: E0320 13:25:40.819756 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:40 crc kubenswrapper[4856]: E0320 13:25:40.819901 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:40 crc kubenswrapper[4856]: E0320 13:25:40.820004 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:40 crc kubenswrapper[4856]: E0320 13:25:40.953662 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:25:42 crc kubenswrapper[4856]: I0320 13:25:42.819500 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:42 crc kubenswrapper[4856]: I0320 13:25:42.819568 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:42 crc kubenswrapper[4856]: I0320 13:25:42.819637 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:42 crc kubenswrapper[4856]: I0320 13:25:42.819699 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:42 crc kubenswrapper[4856]: E0320 13:25:42.819896 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:42 crc kubenswrapper[4856]: E0320 13:25:42.820119 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:42 crc kubenswrapper[4856]: E0320 13:25:42.820369 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:42 crc kubenswrapper[4856]: E0320 13:25:42.820614 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.165040 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.165084 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.165126 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.165166 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.165178 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.182430 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.186663 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.186795 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.186817 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.186839 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.186855 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.204087 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.208425 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.208455 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.208466 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.208480 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.208513 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.220497 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.224511 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.224550 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.224564 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.224582 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.224597 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.240091 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.244245 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.244349 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.244379 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.244411 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:43 crc kubenswrapper[4856]: I0320 13:25:43.244436 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:43Z","lastTransitionTime":"2026-03-20T13:25:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.258317 4856 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1419ae1d-c71a-4a60-94aa-534c0802748c\\\",\\\"systemUUID\\\":\\\"2699e498-35c4-4151-a88c-336d77e1ef57\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:43Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:43 crc kubenswrapper[4856]: E0320 13:25:43.258516 4856 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:25:44 crc kubenswrapper[4856]: I0320 13:25:44.819202 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:44 crc kubenswrapper[4856]: I0320 13:25:44.819313 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:44 crc kubenswrapper[4856]: I0320 13:25:44.819319 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:44 crc kubenswrapper[4856]: E0320 13:25:44.819426 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:44 crc kubenswrapper[4856]: I0320 13:25:44.819236 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:44 crc kubenswrapper[4856]: E0320 13:25:44.819610 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:44 crc kubenswrapper[4856]: E0320 13:25:44.819732 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:44 crc kubenswrapper[4856]: E0320 13:25:44.819827 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.837321 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d87beb25-0f43-415f-9ff6-6c18574a49cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a45a2ba91b83764d576c51fb61d2ae44f3e8546d59f42f93d609895df001d76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d03b9424f81815c16cf532235335dc8bd297826ddb14e633e757cadac7cbe801\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.871738 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"731ac077-61af-4cfd-b5e8-c5f0e0dbb9b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c2d4624734e5fd9a4315d0054b150606629cc806efa0b2691225e8e8259056d2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455516e2e0edf638643836ef412f8e9463ff29961e9fe582e3cc0d477bfc975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64a0df3cfe53460841eacc97a2056c81275f4124877bedd6e1aee2619fdf032f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c102fbac530cf0d9e139a1d4d135c4364c940fd8ead510a480d35750d2f094d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a4a076723d28ef0f8b863fc09db532d29c1d77172ca3039c51c0c13bbfb531b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c28b9d5f7a0502d2f8fbabc7fc56032240ff92f77c7e471538aa0adf722efda\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b05942c380b9e3e49458ea97e9ce92317801c1f27b8ebc2cfd5456dad96dc9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://16f94fbc7cce6978abecd305a55c9f0e7dff5e44db4f5fb69a3371ef856475cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.894236 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbf4a95d-4868-45a5-b740-b2681c6643d6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:24:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:24:10.570339 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:24:10.570975 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2744433657/tls.crt::/tmp/serving-cert-2744433657/tls.key\\\\\\\"\\\\nI0320 13:24:10.794194 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:24:10.799463 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:24:10.799492 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:24:10.799527 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:24:10.799542 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:24:10.808418 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 13:24:10.808442 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808448 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:24:10.808454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:24:10.808458 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:24:10.808463 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:24:10.808467 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 13:24:10.808504 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI0320 13:24:10.810751 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\nF0320 13:24:10.810784 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.910554 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-chwcj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"da4c21dd-2600-4141-bf05-7c18c1932a33\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:25:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:26Z\\\",\\\"message\\\":\\\"2026-03-20T13:24:41+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118\\\\n2026-03-20T13:24:41+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_438b87ee-3352-478d-9315-90e4a8626118 to /host/opt/cni/bin/\\\\n2026-03-20T13:24:41Z [verbose] multus-daemon started\\\\n2026-03-20T13:24:41Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:25:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:25:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-svjh5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-chwcj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.925641 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-f9hch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfc66d0-d1a1-43e7-809d-a36f9bb1750c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48717b40412bf0af9
4b621c24b9799bffc6038b963b5e1ad95bc63b93fdf3604\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8j6n9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-f9hch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.940340 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: E0320 13:25:45.955428 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.967553 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"24a5ae28-8378-4545-af2d-cf1eb86364a2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:25:33Z\\\",\\\"message\\\":\\\"controller.go:445] Built service openshift-operator-lifecycle-manager/olm-operator-metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:25:33.806546 7410 services_controller.go:445] Built service openshift-operator-lifecycle-manager/packageserver-service LB template configs for network=default: 
[]services.lbConfig(nil)\\\\nF0320 13:25:33.806553 7410 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:33Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:25:33.806559 7410 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=def\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:25:32Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://918a6cea69a0156079
e3184fd1450bdccc427706f4d244f33cef54f13698952f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qzmgq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-9njpz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.981928 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2b79d345-85f7-40ac-beef-d6d9083fc195\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8160eacaf6966f106c8ee0be92a64bdfd2b4e6e73450b5d1258000b5c0841e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3fcae669bbee703b86ecfcbc465ad91760cd9
9a27a12ebc700895747630981ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tt2m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-n6c88\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:45 crc kubenswrapper[4856]: I0320 13:25:45.996141 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8e09303-3920-4304-9af9-51f5101e66db\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ced4b8c013e957abf2d0b17d88b6c876353e939922e479fde5495ad52740babd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52939c4aa21558aaea14d28997fd6d0fbcf1fc6ab1c04aea1bd503b3cd8a43ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c2ceecdfd5969127bf02a6f59d3fd9cb988de186910a539bbdc4a4dc08a0540\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://dff5575d76b04151a42def18de99661006e82754cf885d3602bb8eff6c2b9d1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:23:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.017973 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6385447bd314e36218dc317e7f2599e3bab4f4e0f9d2ea55330ada7b7b45cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.033058 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6fd370b0df4668bbc82fcc7d5c32aa8c33ac4b31296721dcc0f223864a2c039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://bda2d24c2aee9d846aa68ce3e946d4aaf2dd746202aeb81bcaee289db7ac5850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.047248 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.063163 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-bvt29" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"24c895f6-954c-4211-8957-0d888d862cfa\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22f3afbc6a90b7c812c65f6ba727c5012b383b34aa68070f7a256e796153f270\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://258242da80bfccebea3e48364c4b20e394808aa4564eaf63b92d01dc75cde99b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7e9c1d08417c88396df3ad8393fd60b6f32a4f8e6b668ad7f0120ce817b37348\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24d193a56520e83895521b59073f243a89d9173accf08b2ec94b379ceb65c8a6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a896
f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a896f9fc9ef39ffe3a85c76bc2558e94bfa07cbf2184a457201b39d4511cb7f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6f2250114b26e48affa7703880113708e8c2dc0c49c05e706f3043ad723e932\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://465a6736b9ed74e4b93c6bef3f6b6ae05ecfcceee00bfc223fa2fbb0e2bf7471\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:24:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:24:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mlb6h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-bvt29\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.078584 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46gr4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qtlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc 
kubenswrapper[4856]: I0320 13:25:46.098510 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1334e3f-c448-49f3-8087-8cd261fadf00\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7abd9e5ca4faeca3f339d28dcf64df0c22ce26e1c848d5210530f63a867340e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a2daf9017ec423108f45af822007082bb115747e863157e0d7ac3dc31658e339\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:23:38Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:23:07.761295 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:23:07.763245 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:23:07.807296 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:23:07.812709 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0320 13:23:38.026236 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:37Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://10f7d2dc7cbb8e096d003217b22eba3167502c00244a907953608f65f0f5db93\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e927e47855d481221ef151c598917c857e2e39ba7ad9e8660a032c40cf6c0c72\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:23:05Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.116129 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f36bd39024655e2ad64c27779c1a377b9faf1b8e0ad8087434618a5056bcfe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.135538 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:29Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.150992 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cb9fx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"209c28b3-bf65-4d96-b67d-531e7463f2d5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc4980766a0790fd3c4fb83acd24f271b8583b75526a6762f57cf50ddef5bb9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gjtnh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cb9fx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.167517 4856 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e51a8789-c529-4a2c-b8f1-dc31a3c06403\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:24:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://01a8d547465b6a2c49419cbae17f548d8654b02757740b002ea660cedc0b0bf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:24:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zqfcl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:24:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dhzh4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:25:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.819865 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:46 crc kubenswrapper[4856]: E0320 13:25:46.820482 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.819894 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:46 crc kubenswrapper[4856]: E0320 13:25:46.820611 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.819952 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:46 crc kubenswrapper[4856]: E0320 13:25:46.820736 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:46 crc kubenswrapper[4856]: I0320 13:25:46.819865 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:46 crc kubenswrapper[4856]: E0320 13:25:46.820846 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:48 crc kubenswrapper[4856]: I0320 13:25:48.819234 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:48 crc kubenswrapper[4856]: I0320 13:25:48.819302 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:48 crc kubenswrapper[4856]: I0320 13:25:48.819234 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:48 crc kubenswrapper[4856]: E0320 13:25:48.819795 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:48 crc kubenswrapper[4856]: E0320 13:25:48.819503 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:48 crc kubenswrapper[4856]: E0320 13:25:48.819881 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:48 crc kubenswrapper[4856]: I0320 13:25:48.820408 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:48 crc kubenswrapper[4856]: E0320 13:25:48.820564 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:49 crc kubenswrapper[4856]: I0320 13:25:49.820197 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:25:49 crc kubenswrapper[4856]: E0320 13:25:49.820502 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:25:50 crc kubenswrapper[4856]: I0320 13:25:50.819172 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:50 crc kubenswrapper[4856]: I0320 13:25:50.819471 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:50 crc kubenswrapper[4856]: I0320 13:25:50.819078 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:50 crc kubenswrapper[4856]: E0320 13:25:50.819719 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:50 crc kubenswrapper[4856]: I0320 13:25:50.819758 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:50 crc kubenswrapper[4856]: E0320 13:25:50.819888 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:50 crc kubenswrapper[4856]: E0320 13:25:50.819958 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:50 crc kubenswrapper[4856]: E0320 13:25:50.820003 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:50 crc kubenswrapper[4856]: E0320 13:25:50.956831 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:25:52 crc kubenswrapper[4856]: I0320 13:25:52.819408 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:52 crc kubenswrapper[4856]: E0320 13:25:52.819533 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:52 crc kubenswrapper[4856]: I0320 13:25:52.819644 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:52 crc kubenswrapper[4856]: I0320 13:25:52.819670 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:52 crc kubenswrapper[4856]: I0320 13:25:52.819723 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:52 crc kubenswrapper[4856]: E0320 13:25:52.819891 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:52 crc kubenswrapper[4856]: E0320 13:25:52.820026 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:52 crc kubenswrapper[4856]: E0320 13:25:52.820128 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.336376 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.336420 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.336431 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.336448 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.336459 4856 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:25:53Z","lastTransitionTime":"2026-03-20T13:25:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.400740 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96"] Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.401363 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.403974 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.405458 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.405720 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.408255 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.439139 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.439201 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.439244 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.439377 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.439429 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.450152 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=51.450121833 podStartE2EDuration="51.450121833s" podCreationTimestamp="2026-03-20 13:25:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.437158794 +0000 UTC m=+168.318184934" watchObservedRunningTime="2026-03-20 13:25:53.450121833 +0000 UTC m=+168.331147993" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.479733 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cb9fx" podStartSLOduration=119.479699733 podStartE2EDuration="1m59.479699733s" 
podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.478776648 +0000 UTC m=+168.359802818" watchObservedRunningTime="2026-03-20 13:25:53.479699733 +0000 UTC m=+168.360725913" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.513532 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=78.51350274 podStartE2EDuration="1m18.51350274s" podCreationTimestamp="2026-03-20 13:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.513373356 +0000 UTC m=+168.394399546" watchObservedRunningTime="2026-03-20 13:25:53.51350274 +0000 UTC m=+168.394528930" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.513805 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podStartSLOduration=119.513793748 podStartE2EDuration="1m59.513793748s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.497757343 +0000 UTC m=+168.378783513" watchObservedRunningTime="2026-03-20 13:25:53.513793748 +0000 UTC m=+168.394819918" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540098 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540199 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540306 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540346 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540353 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540386 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: 
\"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.540424 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.542324 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-service-ca\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.549248 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.575425 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/734fbb6a-fd4c-4275-977d-4dfaeb6aa728-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-dfs96\" (UID: \"734fbb6a-fd4c-4275-977d-4dfaeb6aa728\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.600439 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=82.600421538 podStartE2EDuration="1m22.600421538s" podCreationTimestamp="2026-03-20 13:24:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.600047637 +0000 UTC m=+168.481073767" watchObservedRunningTime="2026-03-20 13:25:53.600421538 +0000 UTC m=+168.481447678" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.619980 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=73.61996277 podStartE2EDuration="1m13.61996277s" podCreationTimestamp="2026-03-20 13:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.619600439 +0000 UTC m=+168.500626579" watchObservedRunningTime="2026-03-20 13:25:53.61996277 +0000 UTC m=+168.500988900" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.640435 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-chwcj" podStartSLOduration=119.640417706 podStartE2EDuration="1m59.640417706s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.63261105 +0000 UTC m=+168.513637180" watchObservedRunningTime="2026-03-20 13:25:53.640417706 +0000 UTC m=+168.521443836" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.640981 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-f9hch" podStartSLOduration=119.640975812 podStartE2EDuration="1m59.640975812s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.640667883 +0000 UTC 
m=+168.521694023" watchObservedRunningTime="2026-03-20 13:25:53.640975812 +0000 UTC m=+168.522001942" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.697692 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-n6c88" podStartSLOduration=119.697668472 podStartE2EDuration="1m59.697668472s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.68279846 +0000 UTC m=+168.563824630" watchObservedRunningTime="2026-03-20 13:25:53.697668472 +0000 UTC m=+168.578694642" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.721469 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.726039 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.726026568 podStartE2EDuration="40.726026568s" podCreationTimestamp="2026-03-20 13:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.712988597 +0000 UTC m=+168.594014727" watchObservedRunningTime="2026-03-20 13:25:53.726026568 +0000 UTC m=+168.607052698" Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.854056 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" event={"ID":"734fbb6a-fd4c-4275-977d-4dfaeb6aa728","Type":"ContainerStarted","Data":"7780e974b7caf24c4838911acc5a27d109b13e4de589411f025d6b84427aec62"} Mar 20 13:25:53 crc kubenswrapper[4856]: I0320 13:25:53.884736 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-bvt29" podStartSLOduration=119.884715175 podStartE2EDuration="1m59.884715175s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:53.883651266 +0000 UTC m=+168.764677446" watchObservedRunningTime="2026-03-20 13:25:53.884715175 +0000 UTC m=+168.765741315" Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.206879 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.219371 4856 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.818976 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.819022 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.819060 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:54 crc kubenswrapper[4856]: E0320 13:25:54.819131 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:54 crc kubenswrapper[4856]: E0320 13:25:54.819220 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:54 crc kubenswrapper[4856]: E0320 13:25:54.819317 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.819979 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:54 crc kubenswrapper[4856]: E0320 13:25:54.820259 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.860147 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" event={"ID":"734fbb6a-fd4c-4275-977d-4dfaeb6aa728","Type":"ContainerStarted","Data":"d28ba5f3b39314adccf43a98ed7a910fb52527b2c068c1e79812121d47222576"} Mar 20 13:25:54 crc kubenswrapper[4856]: I0320 13:25:54.882796 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-dfs96" podStartSLOduration=120.882767208 podStartE2EDuration="2m0.882767208s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:54.881618707 +0000 UTC m=+169.762644867" watchObservedRunningTime="2026-03-20 13:25:54.882767208 +0000 UTC m=+169.763793378" Mar 20 13:25:55 crc kubenswrapper[4856]: E0320 13:25:55.957518 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:25:56 crc kubenswrapper[4856]: I0320 13:25:56.819743 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:56 crc kubenswrapper[4856]: I0320 13:25:56.819833 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:56 crc kubenswrapper[4856]: E0320 13:25:56.819876 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:56 crc kubenswrapper[4856]: I0320 13:25:56.819954 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:56 crc kubenswrapper[4856]: E0320 13:25:56.820143 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:56 crc kubenswrapper[4856]: E0320 13:25:56.820304 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:56 crc kubenswrapper[4856]: I0320 13:25:56.820393 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:56 crc kubenswrapper[4856]: E0320 13:25:56.820466 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:25:57 crc kubenswrapper[4856]: I0320 13:25:57.486764 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:57 crc kubenswrapper[4856]: E0320 13:25:57.487003 4856 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:57 crc kubenswrapper[4856]: E0320 13:25:57.487648 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs podName:2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca nodeName:}" failed. No retries permitted until 2026-03-20 13:27:01.487613401 +0000 UTC m=+236.368639561 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs") pod "network-metrics-daemon-qtlvp" (UID: "2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:25:58 crc kubenswrapper[4856]: I0320 13:25:58.818791 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:25:58 crc kubenswrapper[4856]: I0320 13:25:58.818870 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:25:58 crc kubenswrapper[4856]: E0320 13:25:58.818992 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:25:58 crc kubenswrapper[4856]: I0320 13:25:58.819013 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:25:58 crc kubenswrapper[4856]: I0320 13:25:58.819047 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:58 crc kubenswrapper[4856]: E0320 13:25:58.819142 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:25:58 crc kubenswrapper[4856]: E0320 13:25:58.819735 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:25:58 crc kubenswrapper[4856]: E0320 13:25:58.819914 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:00 crc kubenswrapper[4856]: I0320 13:26:00.819208 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:00 crc kubenswrapper[4856]: I0320 13:26:00.819255 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:00 crc kubenswrapper[4856]: I0320 13:26:00.819212 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:00 crc kubenswrapper[4856]: E0320 13:26:00.819409 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:00 crc kubenswrapper[4856]: I0320 13:26:00.819460 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:00 crc kubenswrapper[4856]: E0320 13:26:00.819688 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:00 crc kubenswrapper[4856]: E0320 13:26:00.820081 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:00 crc kubenswrapper[4856]: E0320 13:26:00.820214 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:00 crc kubenswrapper[4856]: E0320 13:26:00.958755 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:26:01 crc kubenswrapper[4856]: I0320 13:26:01.820463 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:26:01 crc kubenswrapper[4856]: E0320 13:26:01.820656 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-9njpz_openshift-ovn-kubernetes(24a5ae28-8378-4545-af2d-cf1eb86364a2)\"" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" Mar 20 13:26:02 crc kubenswrapper[4856]: I0320 13:26:02.819143 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:02 crc kubenswrapper[4856]: I0320 13:26:02.819236 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:02 crc kubenswrapper[4856]: I0320 13:26:02.819168 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:02 crc kubenswrapper[4856]: E0320 13:26:02.819380 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:02 crc kubenswrapper[4856]: I0320 13:26:02.819449 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:02 crc kubenswrapper[4856]: E0320 13:26:02.819505 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:02 crc kubenswrapper[4856]: E0320 13:26:02.819607 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:02 crc kubenswrapper[4856]: E0320 13:26:02.819731 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:04 crc kubenswrapper[4856]: I0320 13:26:04.819169 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:04 crc kubenswrapper[4856]: I0320 13:26:04.819256 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:04 crc kubenswrapper[4856]: I0320 13:26:04.819371 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:04 crc kubenswrapper[4856]: I0320 13:26:04.819256 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:04 crc kubenswrapper[4856]: E0320 13:26:04.819471 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:04 crc kubenswrapper[4856]: E0320 13:26:04.819604 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:04 crc kubenswrapper[4856]: E0320 13:26:04.819389 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:04 crc kubenswrapper[4856]: E0320 13:26:04.820127 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:05 crc kubenswrapper[4856]: E0320 13:26:05.959627 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:06 crc kubenswrapper[4856]: I0320 13:26:06.818716 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:06 crc kubenswrapper[4856]: I0320 13:26:06.818802 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:06 crc kubenswrapper[4856]: I0320 13:26:06.818871 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:06 crc kubenswrapper[4856]: E0320 13:26:06.818987 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:06 crc kubenswrapper[4856]: I0320 13:26:06.819009 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:06 crc kubenswrapper[4856]: E0320 13:26:06.819139 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:06 crc kubenswrapper[4856]: E0320 13:26:06.819236 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:06 crc kubenswrapper[4856]: E0320 13:26:06.819325 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:08 crc kubenswrapper[4856]: I0320 13:26:08.819777 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:08 crc kubenswrapper[4856]: I0320 13:26:08.819793 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:08 crc kubenswrapper[4856]: E0320 13:26:08.819970 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:08 crc kubenswrapper[4856]: E0320 13:26:08.820061 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:08 crc kubenswrapper[4856]: I0320 13:26:08.819805 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:08 crc kubenswrapper[4856]: I0320 13:26:08.820436 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:08 crc kubenswrapper[4856]: E0320 13:26:08.820474 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:08 crc kubenswrapper[4856]: E0320 13:26:08.820706 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:10 crc kubenswrapper[4856]: I0320 13:26:10.819476 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:10 crc kubenswrapper[4856]: I0320 13:26:10.819515 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:10 crc kubenswrapper[4856]: I0320 13:26:10.819505 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:10 crc kubenswrapper[4856]: E0320 13:26:10.819703 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:10 crc kubenswrapper[4856]: E0320 13:26:10.819803 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:10 crc kubenswrapper[4856]: E0320 13:26:10.819997 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:10 crc kubenswrapper[4856]: I0320 13:26:10.820111 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:10 crc kubenswrapper[4856]: E0320 13:26:10.820253 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:10 crc kubenswrapper[4856]: E0320 13:26:10.961633 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.819813 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.819847 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:12 crc kubenswrapper[4856]: E0320 13:26:12.820346 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.819885 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.819849 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:12 crc kubenswrapper[4856]: E0320 13:26:12.820491 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:12 crc kubenswrapper[4856]: E0320 13:26:12.820587 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:12 crc kubenswrapper[4856]: E0320 13:26:12.820673 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.932696 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/1.log" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.933621 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/0.log" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.933718 4856 generic.go:334] "Generic (PLEG): container finished" podID="da4c21dd-2600-4141-bf05-7c18c1932a33" containerID="2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608" exitCode=1 Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.933780 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerDied","Data":"2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608"} Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.933848 4856 scope.go:117] "RemoveContainer" containerID="de14ff0a74852f26bafc745d6f0c2919863571321c4d13c06bf87714bdf5e641" Mar 20 13:26:12 crc kubenswrapper[4856]: I0320 13:26:12.934393 4856 scope.go:117] "RemoveContainer" containerID="2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608" Mar 20 13:26:12 crc kubenswrapper[4856]: E0320 13:26:12.934626 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-chwcj_openshift-multus(da4c21dd-2600-4141-bf05-7c18c1932a33)\"" pod="openshift-multus/multus-chwcj" podUID="da4c21dd-2600-4141-bf05-7c18c1932a33" Mar 20 13:26:13 crc kubenswrapper[4856]: I0320 13:26:13.820928 4856 scope.go:117] "RemoveContainer" 
containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:26:13 crc kubenswrapper[4856]: I0320 13:26:13.940307 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/1.log" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.819580 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:14 crc kubenswrapper[4856]: E0320 13:26:14.820067 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.819633 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.819720 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.819659 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:14 crc kubenswrapper[4856]: E0320 13:26:14.820390 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:14 crc kubenswrapper[4856]: E0320 13:26:14.820430 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:14 crc kubenswrapper[4856]: E0320 13:26:14.820506 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.896892 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qtlvp"] Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.947685 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/3.log" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.951767 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:14 crc kubenswrapper[4856]: E0320 13:26:14.951964 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.952265 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerStarted","Data":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:26:14 crc kubenswrapper[4856]: I0320 13:26:14.953097 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:26:15 crc kubenswrapper[4856]: E0320 13:26:15.962321 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:16 crc kubenswrapper[4856]: I0320 13:26:16.819396 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:16 crc kubenswrapper[4856]: I0320 13:26:16.819541 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:16 crc kubenswrapper[4856]: I0320 13:26:16.819443 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:16 crc kubenswrapper[4856]: E0320 13:26:16.819620 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:16 crc kubenswrapper[4856]: I0320 13:26:16.819443 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:16 crc kubenswrapper[4856]: E0320 13:26:16.819743 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:16 crc kubenswrapper[4856]: E0320 13:26:16.819882 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:16 crc kubenswrapper[4856]: E0320 13:26:16.819992 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:18 crc kubenswrapper[4856]: I0320 13:26:18.818785 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:18 crc kubenswrapper[4856]: I0320 13:26:18.818847 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:18 crc kubenswrapper[4856]: E0320 13:26:18.818975 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:18 crc kubenswrapper[4856]: I0320 13:26:18.819035 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:18 crc kubenswrapper[4856]: I0320 13:26:18.819048 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:18 crc kubenswrapper[4856]: E0320 13:26:18.819425 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:18 crc kubenswrapper[4856]: E0320 13:26:18.819649 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:18 crc kubenswrapper[4856]: E0320 13:26:18.819741 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:20 crc kubenswrapper[4856]: I0320 13:26:20.818847 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:20 crc kubenswrapper[4856]: I0320 13:26:20.818962 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:20 crc kubenswrapper[4856]: I0320 13:26:20.818846 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:20 crc kubenswrapper[4856]: E0320 13:26:20.819072 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:20 crc kubenswrapper[4856]: I0320 13:26:20.819199 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:20 crc kubenswrapper[4856]: E0320 13:26:20.819416 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:20 crc kubenswrapper[4856]: E0320 13:26:20.819616 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:20 crc kubenswrapper[4856]: E0320 13:26:20.819799 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:20 crc kubenswrapper[4856]: E0320 13:26:20.963509 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:22 crc kubenswrapper[4856]: I0320 13:26:22.818913 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:22 crc kubenswrapper[4856]: I0320 13:26:22.818924 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:22 crc kubenswrapper[4856]: E0320 13:26:22.819566 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:22 crc kubenswrapper[4856]: I0320 13:26:22.818995 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:22 crc kubenswrapper[4856]: I0320 13:26:22.818966 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:22 crc kubenswrapper[4856]: E0320 13:26:22.819723 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:22 crc kubenswrapper[4856]: E0320 13:26:22.819861 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:22 crc kubenswrapper[4856]: E0320 13:26:22.820086 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:24 crc kubenswrapper[4856]: I0320 13:26:24.818769 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:24 crc kubenswrapper[4856]: I0320 13:26:24.818832 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:24 crc kubenswrapper[4856]: I0320 13:26:24.818873 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:24 crc kubenswrapper[4856]: E0320 13:26:24.818999 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:24 crc kubenswrapper[4856]: I0320 13:26:24.819028 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:24 crc kubenswrapper[4856]: E0320 13:26:24.819191 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:24 crc kubenswrapper[4856]: E0320 13:26:24.819365 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:24 crc kubenswrapper[4856]: E0320 13:26:24.819537 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:25 crc kubenswrapper[4856]: I0320 13:26:25.820810 4856 scope.go:117] "RemoveContainer" containerID="2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608" Mar 20 13:26:25 crc kubenswrapper[4856]: I0320 13:26:25.842595 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podStartSLOduration=151.842578315 podStartE2EDuration="2m31.842578315s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:14.978634179 +0000 UTC m=+189.859660319" watchObservedRunningTime="2026-03-20 13:26:25.842578315 +0000 UTC m=+200.723604455" Mar 20 13:26:25 crc kubenswrapper[4856]: E0320 13:26:25.963994 4856 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:26:25 crc kubenswrapper[4856]: I0320 13:26:25.992092 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/1.log" Mar 20 13:26:25 crc kubenswrapper[4856]: I0320 13:26:25.992170 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerStarted","Data":"674d23b0d62a7fc9ce60c35c93919539056a2169216e683cc45f7588b8727351"} Mar 20 13:26:26 crc kubenswrapper[4856]: I0320 13:26:26.819561 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:26 crc kubenswrapper[4856]: E0320 13:26:26.819954 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:26 crc kubenswrapper[4856]: I0320 13:26:26.819611 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:26 crc kubenswrapper[4856]: E0320 13:26:26.820036 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:26 crc kubenswrapper[4856]: I0320 13:26:26.819586 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:26 crc kubenswrapper[4856]: I0320 13:26:26.819645 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:26 crc kubenswrapper[4856]: E0320 13:26:26.820116 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:26 crc kubenswrapper[4856]: E0320 13:26:26.820168 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:28 crc kubenswrapper[4856]: I0320 13:26:28.818978 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:28 crc kubenswrapper[4856]: E0320 13:26:28.819188 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:28 crc kubenswrapper[4856]: I0320 13:26:28.819002 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:28 crc kubenswrapper[4856]: I0320 13:26:28.819420 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:28 crc kubenswrapper[4856]: I0320 13:26:28.819533 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:28 crc kubenswrapper[4856]: E0320 13:26:28.819896 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:28 crc kubenswrapper[4856]: E0320 13:26:28.820348 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:28 crc kubenswrapper[4856]: E0320 13:26:28.820499 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:30 crc kubenswrapper[4856]: I0320 13:26:30.818974 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:30 crc kubenswrapper[4856]: I0320 13:26:30.819053 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:30 crc kubenswrapper[4856]: E0320 13:26:30.819145 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:26:30 crc kubenswrapper[4856]: I0320 13:26:30.819236 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:30 crc kubenswrapper[4856]: I0320 13:26:30.819236 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:30 crc kubenswrapper[4856]: E0320 13:26:30.819400 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:26:30 crc kubenswrapper[4856]: E0320 13:26:30.819548 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qtlvp" podUID="2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca" Mar 20 13:26:30 crc kubenswrapper[4856]: E0320 13:26:30.819651 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.819372 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.819469 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.819486 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.819606 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.822636 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.822988 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.825802 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.825841 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.825906 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:26:32 crc kubenswrapper[4856]: I0320 13:26:32.826538 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.079424 4856 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.133954 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqq88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.134816 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.138057 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.141660 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.142376 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.142948 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.142988 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.143489 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.143611 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.143735 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.143882 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.143497 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.144299 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.144159 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.151377 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.152128 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.152782 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.153002 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.153003 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.153682 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.161079 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.161202 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.161998 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.162733 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.163096 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.166547 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.167661 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.169193 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.172764 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.178571 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179122 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179134 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179564 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179722 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179574 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179682 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.179914 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.180620 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.180999 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.181367 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.181589 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.181755 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.181372 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.182566 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.182849 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.183059 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.183196 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.183757 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.183958 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184198 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184444 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184508 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184740 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184773 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.184448 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.186117 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.191336 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.191933 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.197992 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.198467 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.208456 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.216839 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.208473 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.219795 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.231851 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.232114 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.232775 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.232885 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lpbh5"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.233293 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.233492 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.233564 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.233682 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.236986 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.238765 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.239093 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.241737 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242048 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242206 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242335 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242446 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242697 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242813 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.242921 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.243523 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.243640 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.243902 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.247985 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vzmd5"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.248882 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vzmd5"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.255349 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.256209 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.257396 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.257831 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.258330 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.258567 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.258596 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.258998 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.259199 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.259645 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.268215 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.259625 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.268550 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.262321 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.262457 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.262716 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.268904 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.262746 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.262821 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.263387 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.263407 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.263744 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.263989 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.264031 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.272472 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-9pbqq"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.273104 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pbqq"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.274036 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.274125 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.274467 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.275446 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2nch8"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.276727 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.277611 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pdv92"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.278136 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.294579 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.295440 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.297139 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.301902 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.301945 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.302922 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.303333 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.306928 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.308477 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.308636 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.308756 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.308970 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309067 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309145 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309223 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309397 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309186 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309633 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309731 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309787 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.309957 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.310100 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.310163 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.310823 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.312021 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vzmd5"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.312547 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.312618 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.313127 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.315727 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.315863 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.316441 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pmhrr"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.316827 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.316922 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.317206 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pmhrr"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.317285 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.317606 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-client\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318475 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-encryption-config\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318608 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-audit-policies\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318745 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-config\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318864 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318974 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54mql\" (UniqueName: \"kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319089 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-audit\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319202 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxjs\" (UniqueName: \"kubernetes.io/projected/89f7865f-b485-4242-a58d-52252234aa99-kube-api-access-wqxjs\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.317843 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.318775 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319466 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319400 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319832 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-image-import-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.319953 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320068 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320181 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-config\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320425 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-machine-approver-tls\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320542 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-config\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320629 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-serving-cert\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320804 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.320903 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321007 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjmk8\" (UniqueName: \"kubernetes.io/projected/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-kube-api-access-jjmk8\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321116 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkpmm\" (UniqueName: \"kubernetes.io/projected/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-kube-api-access-mkpmm\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321229 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-etcd-client\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321378 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321475 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321565 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89f7865f-b485-4242-a58d-52252234aa99-audit-dir\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321653 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-encryption-config\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321751 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnv7k\" (UniqueName: \"kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.317767 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.322376 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wxk9"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323155 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"]
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.321853 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323591 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-images\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323608 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-serving-cert\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323624 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-node-pullsecrets\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323642 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323663 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323679 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmwx5\" (UniqueName: \"kubernetes.io/projected/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-kube-api-access-pmwx5\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") "
pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323696 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323715 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-serving-cert\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323734 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323749 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-config\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323763 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323777 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6d68\" (UniqueName: \"kubernetes.io/projected/a15d3bc4-898a-48c7-a076-23e0911e635e-kube-api-access-z6d68\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323795 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323810 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323827 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gplnl\" (UniqueName: \"kubernetes.io/projected/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-kube-api-access-gplnl\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323843 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-audit-dir\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.323857 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-auth-proxy-config\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.324393 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.324478 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.324650 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.324862 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.325119 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.325225 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.325683 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.325877 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.326224 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.326391 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.325855 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.329383 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-26dzg"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.329709 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7kv88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.329796 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.329992 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.330184 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.330306 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.333896 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.334349 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.334431 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.334391 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335104 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vkb56"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335574 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335595 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335806 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335925 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqq88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.335941 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566886-xkbwc"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.336129 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.336241 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.336350 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.336480 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.337299 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.338118 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.338639 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pdv92"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.343787 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.349761 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.359924 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.361745 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.377700 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.380420 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" 
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.381631 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.381763 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pbqq"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.381870 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.381964 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.382048 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2nch8"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.382132 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.382215 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lpbh5"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.383341 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkb56"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.388235 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.389990 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.391693 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.392914 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.395389 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7kv88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.396383 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.396567 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.397357 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.398324 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wxk9"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.399332 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fph5p"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.399879 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.400311 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.401488 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.402525 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.403409 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-xkbwc"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.404387 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.405360 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.406318 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.408299 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-26dzg"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.409322 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fph5p"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.410280 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-ttgqh"] Mar 20 
13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.410777 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.411294 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7wsbq"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.412262 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.413180 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7wsbq"] Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.417253 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424203 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424226 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89f7865f-b485-4242-a58d-52252234aa99-audit-dir\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424245 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-encryption-config\") pod 
\"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424281 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnv7k\" (UniqueName: \"kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424307 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424321 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-images\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424335 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-serving-cert\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424351 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-node-pullsecrets\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424366 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424371 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/89f7865f-b485-4242-a58d-52252234aa99-audit-dir\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424379 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424398 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424417 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pmwx5\" (UniqueName: \"kubernetes.io/projected/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-kube-api-access-pmwx5\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424434 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424451 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-serving-cert\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424456 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-node-pullsecrets\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424465 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc 
kubenswrapper[4856]: I0320 13:26:34.424510 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424568 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-config\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424595 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj9m9\" (UniqueName: \"kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424616 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424670 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424691 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424736 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6d68\" (UniqueName: \"kubernetes.io/projected/a15d3bc4-898a-48c7-a076-23e0911e635e-kube-api-access-z6d68\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424756 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gplnl\" (UniqueName: \"kubernetes.io/projected/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-kube-api-access-gplnl\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424773 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-audit-dir\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424788 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-auth-proxy-config\") 
pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424807 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424825 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-client\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424841 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2jzn\" (UniqueName: \"kubernetes.io/projected/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-kube-api-access-j2jzn\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424857 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424875 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-audit-policies\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424890 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-encryption-config\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424915 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-config\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424932 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424948 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54mql\" (UniqueName: \"kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424963 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-audit\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.424985 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxjs\" (UniqueName: \"kubernetes.io/projected/89f7865f-b485-4242-a58d-52252234aa99-kube-api-access-wqxjs\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425001 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425017 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425032 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-image-import-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 
13:26:34.425032 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425049 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-config\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425065 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-machine-approver-tls\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425080 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-config\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425095 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hqths\" (UID: 
\"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425111 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-serving-cert\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425129 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-proxy-tls\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425145 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425165 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425184 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425201 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjmk8\" (UniqueName: \"kubernetes.io/projected/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-kube-api-access-jjmk8\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425217 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkpmm\" (UniqueName: \"kubernetes.io/projected/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-kube-api-access-mkpmm\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425232 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-etcd-client\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425247 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc 
kubenswrapper[4856]: I0320 13:26:34.425263 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425306 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425889 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-service-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.425900 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.426079 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-config\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " 
pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.426419 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-audit\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.426630 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.426881 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-images\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.427582 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.427719 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fqq88\" (UID: 
\"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.427772 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-serving-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.428072 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.428685 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a15d3bc4-898a-48c7-a076-23e0911e635e-audit-dir\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.428689 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.429447 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-config\") pod \"machine-approver-56656f9798-c2hg7\" (UID: 
\"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.429643 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-auth-proxy-config\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.429860 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a15d3bc4-898a-48c7-a076-23e0911e635e-image-import-ca\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.429883 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-encryption-config\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.429887 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430003 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-config\") pod 
\"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430169 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-audit-policies\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430205 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-serving-cert\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430228 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-config\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430484 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/89f7865f-b485-4242-a58d-52252234aa99-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430505 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.430952 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431416 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-serving-cert\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431435 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-etcd-client\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431705 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-machine-approver-tls\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431772 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431889 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.431938 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-etcd-client\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.433579 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a15d3bc4-898a-48c7-a076-23e0911e635e-serving-cert\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.433863 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-serving-cert\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.433921 4856 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/89f7865f-b485-4242-a58d-52252234aa99-encryption-config\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.437311 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.456628 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.477011 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.497499 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.518012 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525706 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525742 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525800 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525823 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525846 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj9m9\" (UniqueName: \"kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525866 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525882 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2jzn\" (UniqueName: \"kubernetes.io/projected/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-kube-api-access-j2jzn\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:34 crc 
kubenswrapper[4856]: I0320 13:26:34.525896 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525954 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-proxy-tls\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.525969 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.526708 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.527438 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.527498 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.528126 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.528586 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.531918 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.534418 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.556630 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.559174 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.577295 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.597764 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.617329 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.637728 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.657540 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.677298 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.697874 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.717142 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.737801 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.765056 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.777519 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.817544 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.837698 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.857408 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.879361 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.899167 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.918373 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.938485 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.959156 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.978293 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 13:26:34 crc kubenswrapper[4856]: I0320 13:26:34.997317 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.017697 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.038670 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.058401 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.078364 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.098371 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.118132 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.138524 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.157633 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.178181 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.198627 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.217871 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.239084 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.258000 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.278351 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.297696 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.318046 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.333368 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-proxy-tls\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.335614 4856 request.go:700] Waited for 1.010605868s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-controller-dockercfg-c2lfx&limit=500&resourceVersion=0
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.337971 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.357758 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.377889 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.397042 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.418160 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.438367 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.457970 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.491453 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.498171 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.518173 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.538421 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.557638 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.577660 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.598499 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.618883 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.637963 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.658882 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.678403 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.698948 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.719207 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.737806 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.757147 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.777697 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.797494 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.817746 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.838941 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.858134 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.898878 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.918241 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.938065 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.958878 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.978507 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:26:35 crc kubenswrapper[4856]: I0320 13:26:35.998072 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.018664 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.038231 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.059197 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.078437 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.099174 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.117254 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.137972 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.158993 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.177818 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.197895 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.217612 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.239074 4856 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.258657 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.306080 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnv7k\" (UniqueName: \"kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k\") pod \"route-controller-manager-6576b87f9c-n8vk7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.326601 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54mql\" (UniqueName: \"kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql\") pod \"controller-manager-879f6c89f-dtwhv\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.334658 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.336409 4856 request.go:700] Waited for 1.909109978s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.343528 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxjs\" (UniqueName: \"kubernetes.io/projected/89f7865f-b485-4242-a58d-52252234aa99-kube-api-access-wqxjs\") pod \"apiserver-7bbb656c7d-hqths\" (UID: \"89f7865f-b485-4242-a58d-52252234aa99\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.365405 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjmk8\" (UniqueName: \"kubernetes.io/projected/1d65a22a-bf52-43e0-a4c3-60808f60b2e5-kube-api-access-jjmk8\") pod \"openshift-config-operator-7777fb866f-2jgcz\" (UID: \"1d65a22a-bf52-43e0-a4c3-60808f60b2e5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.370900 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.393426 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmwx5\" (UniqueName: \"kubernetes.io/projected/3cfd97a6-abeb-4d68-97c3-751aaa6445f9-kube-api-access-pmwx5\") pod \"authentication-operator-69f744f599-dvdxt\" (UID: \"3cfd97a6-abeb-4d68-97c3-751aaa6445f9\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.403947 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkpmm\" (UniqueName: \"kubernetes.io/projected/36be461e-6bde-44a4-8cbe-35a5c8ef8be8-kube-api-access-mkpmm\") pod \"machine-api-operator-5694c8668f-cvkxl\" (UID: \"36be461e-6bde-44a4-8cbe-35a5c8ef8be8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.409484 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.416059 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gplnl\" (UniqueName: \"kubernetes.io/projected/2a2f9676-ad0d-44ce-939a-0c12e90b0d31-kube-api-access-gplnl\") pod \"machine-approver-56656f9798-c2hg7\" (UID: \"2a2f9676-ad0d-44ce-939a-0c12e90b0d31\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.418263 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.445763 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6d68\" (UniqueName: \"kubernetes.io/projected/a15d3bc4-898a-48c7-a076-23e0911e635e-kube-api-access-z6d68\") pod \"apiserver-76f77b778f-fqq88\" (UID: \"a15d3bc4-898a-48c7-a076-23e0911e635e\") " pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.459083 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj9m9\" (UniqueName: \"kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9\") pod \"console-f9d7485db-jwjhv\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.489407 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jwjhv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.490705 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2jzn\" (UniqueName: \"kubernetes.io/projected/d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5-kube-api-access-j2jzn\") pod \"machine-config-controller-84d6567774-q9k5z\" (UID: \"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550842 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-config\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550874 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550902 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550917 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8s67\" (UniqueName: \"kubernetes.io/projected/66fe265e-f557-4aff-a055-bb9d0ca82215-kube-api-access-n8s67\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550931 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlx8m\" (UniqueName: \"kubernetes.io/projected/7d3c0cec-4c3b-4af7-880f-9517c8233886-kube-api-access-vlx8m\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550947 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c645067e-915e-4021-b388-de1c159d99da-trusted-ca\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.550973 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee423424-e9c4-489d-a372-3b6a7dff66bd-metrics-tls\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551009 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551025 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-config\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551039 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1adf702-1031-43c9-be15-8297a51e6958-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551053 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d3c0cec-4c3b-4af7-880f-9517c8233886-config\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551068 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6117aa-1e35-4102-8181-6d6e370c4bee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551083 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551100 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551116 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc2fe95-1309-4007-8102-ba43375cf22b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551140 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2tnx\" (UniqueName: \"kubernetes.io/projected/f1adf702-1031-43c9-be15-8297a51e6958-kube-api-access-z2tnx\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551491 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c071e4-4f55-46c4-944f-05ba67dec8dd-service-ca-bundle\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551536 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-trusted-ca\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551574 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28nm\" (UniqueName: \"kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551596 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551614 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-srv-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551628 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1adf702-1031-43c9-be15-8297a51e6958-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551643 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551872 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6117aa-1e35-4102-8181-6d6e370c4bee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551893 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19e2de31-96c2-4e95-a83c-f086710a9bc0-proxy-tls\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb"
Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551934 4856 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/072648eb-9536-49d4-a7a8-411ee27e377b-signing-cabundle\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551968 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.551986 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4d7\" (UniqueName: \"kubernetes.io/projected/b63b297b-9f19-4062-b26a-5d1888e1280e-kube-api-access-qx4d7\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552001 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km9t4\" (UniqueName: \"kubernetes.io/projected/ee423424-e9c4-489d-a372-3b6a7dff66bd-kube-api-access-km9t4\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552117 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5bmj\" (UniqueName: \"kubernetes.io/projected/de948299-5822-4c15-b312-ffc6b83f6cc9-kube-api-access-q5bmj\") pod 
\"migrator-59844c95c7-b8ltz\" (UID: \"de948299-5822-4c15-b312-ffc6b83f6cc9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552139 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6117aa-1e35-4102-8181-6d6e370c4bee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552154 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-config\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccnm9\" (UniqueName: \"kubernetes.io/projected/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-kube-api-access-ccnm9\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552197 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c645067e-915e-4021-b388-de1c159d99da-metrics-tls\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552245 
4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vn4n\" (UniqueName: \"kubernetes.io/projected/c078f2c5-1010-4c6e-852a-65b6d94dfa16-kube-api-access-9vn4n\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552293 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwl8h\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-kube-api-access-xwl8h\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552319 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0178be03-ba26-4fb0-88b7-853a34780442-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552342 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552387 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b63b297b-9f19-4062-b26a-5d1888e1280e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552402 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc2fe95-1309-4007-8102-ba43375cf22b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552419 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3ba6ea-9193-4ab8-a6b3-938f4069334a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552445 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552459 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk5sd\" (UniqueName: \"kubernetes.io/projected/b5cb6a22-572a-47fb-978b-b091ab19f2d6-kube-api-access-zk5sd\") pod 
\"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.552992 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-metrics-certs\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553117 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553153 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6qbh\" (UniqueName: \"kubernetes.io/projected/0178be03-ba26-4fb0-88b7-853a34780442-kube-api-access-g6qbh\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 
13:26:36.553205 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628df06a-2257-49f7-9d72-5aa490049230-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: \"628df06a-2257-49f7-9d72-5aa490049230\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553288 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-images\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553326 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8v74\" (UniqueName: \"kubernetes.io/projected/8c4e197a-deea-42d4-a7b1-0d83cc546b76-kube-api-access-p8v74\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553363 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553382 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553399 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553420 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zjb\" (UniqueName: \"kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553444 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553476 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdwvd\" (UniqueName: \"kubernetes.io/projected/f8c071e4-4f55-46c4-944f-05ba67dec8dd-kube-api-access-qdwvd\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " 
pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553507 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbvr\" (UniqueName: \"kubernetes.io/projected/ae4a83e0-1d82-4893-af26-ff6f5741b9a0-kube-api-access-xwbvr\") pod \"downloads-7954f5f757-9pbqq\" (UID: \"ae4a83e0-1d82-4893-af26-ff6f5741b9a0\") " pod="openshift-console/downloads-7954f5f757-9pbqq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553553 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63b297b-9f19-4062-b26a-5d1888e1280e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553574 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlcc\" (UniqueName: \"kubernetes.io/projected/072648eb-9536-49d4-a7a8-411ee27e377b-kube-api-access-xjlcc\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553596 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-config\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553616 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553649 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gvn\" (UniqueName: \"kubernetes.io/projected/5dc2fe95-1309-4007-8102-ba43375cf22b-kube-api-access-n9gvn\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553669 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-client\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553688 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rf7jq\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-kube-api-access-rf7jq\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553709 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: 
\"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553729 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553762 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3ba6ea-9193-4ab8-a6b3-938f4069334a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553798 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553819 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: 
I0320 13:26:36.553867 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553888 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553931 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553947 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-profile-collector-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.553982 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-serving-cert\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554003 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9tk\" (UniqueName: \"kubernetes.io/projected/19e2de31-96c2-4e95-a83c-f086710a9bc0-kube-api-access-qv9tk\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554017 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lps8f\" (UniqueName: \"kubernetes.io/projected/91878935-9c8c-4927-bafe-16719ddb8461-kube-api-access-lps8f\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554038 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554052 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-service-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: 
\"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554066 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554083 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c078f2c5-1010-4c6e-852a-65b6d94dfa16-serving-cert\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554097 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5cb6a22-572a-47fb-978b-b091ab19f2d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554113 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-srv-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554159 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554184 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d3c0cec-4c3b-4af7-880f-9517c8233886-serving-cert\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554201 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c4e197a-deea-42d4-a7b1-0d83cc546b76-tmpfs\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554219 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpwbd\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554235 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: 
\"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554258 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-stats-auth\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554334 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554354 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554373 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/072648eb-9536-49d4-a7a8-411ee27e377b-signing-key\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554391 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554411 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-default-certificate\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554446 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554478 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-webhook-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.554501 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2jfr\" (UniqueName: \"kubernetes.io/projected/628df06a-2257-49f7-9d72-5aa490049230-kube-api-access-z2jfr\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: \"628df06a-2257-49f7-9d72-5aa490049230\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.555285 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.055251609 +0000 UTC m=+211.936277739 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.586502 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.595956 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.602772 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.654990 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.655101 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.155075669 +0000 UTC m=+212.036101799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655144 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xglz7\" (UniqueName: \"kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655167 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-certs\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655194 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655214 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x66f\" (UniqueName: \"kubernetes.io/projected/e5460cbb-96ab-496c-84cf-e85c25f68fcc-kube-api-access-9x66f\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655229 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d3c0cec-4c3b-4af7-880f-9517c8233886-serving-cert\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655248 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c4e197a-deea-42d4-a7b1-0d83cc546b76-tmpfs\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655291 4856 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655295 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpwbd\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655759 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655779 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-stats-auth\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655796 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655811 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655832 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bv5l\" (UniqueName: \"kubernetes.io/projected/0086dcd3-2759-447b-907e-926a36e7a25d-kube-api-access-7bv5l\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655860 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/072648eb-9536-49d4-a7a8-411ee27e377b-signing-key\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655879 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa44c47f-e650-4056-9588-51fd98a96b99-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655909 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc 
kubenswrapper[4856]: I0320 13:26:36.655925 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-default-certificate\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655943 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655961 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-webhook-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655977 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfnk9\" (UniqueName: \"kubernetes.io/projected/fa44c47f-e650-4056-9588-51fd98a96b99-kube-api-access-bfnk9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.655994 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2jfr\" (UniqueName: \"kubernetes.io/projected/628df06a-2257-49f7-9d72-5aa490049230-kube-api-access-z2jfr\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: 
\"628df06a-2257-49f7-9d72-5aa490049230\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656011 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-config\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656031 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656047 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8s67\" (UniqueName: \"kubernetes.io/projected/66fe265e-f557-4aff-a055-bb9d0ca82215-kube-api-access-n8s67\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656063 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlx8m\" (UniqueName: \"kubernetes.io/projected/7d3c0cec-4c3b-4af7-880f-9517c8233886-kube-api-access-vlx8m\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656079 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c645067e-915e-4021-b388-de1c159d99da-trusted-ca\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656094 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656112 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db9nr\" (UniqueName: \"kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr\") pod \"auto-csr-approver-29566886-xkbwc\" (UID: \"46f98403-30f8-40f6-afa6-6defe5937024\") " pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656111 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-auth-proxy-config\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656127 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc 
kubenswrapper[4856]: I0320 13:26:36.656155 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee423424-e9c4-489d-a372-3b6a7dff66bd-metrics-tls\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656172 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656189 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-config\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656217 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1adf702-1031-43c9-be15-8297a51e6958-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656235 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d3c0cec-4c3b-4af7-880f-9517c8233886-config\") pod 
\"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656250 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5460cbb-96ab-496c-84cf-e85c25f68fcc-config-volume\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656286 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6117aa-1e35-4102-8181-6d6e370c4bee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656302 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-node-bootstrap-token\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659692 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-csi-data-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659741 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659791 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659818 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc2fe95-1309-4007-8102-ba43375cf22b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659904 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2tnx\" (UniqueName: \"kubernetes.io/projected/f1adf702-1031-43c9-be15-8297a51e6958-kube-api-access-z2tnx\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659965 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c071e4-4f55-46c4-944f-05ba67dec8dd-service-ca-bundle\") pod \"router-default-5444994796-pmhrr\" (UID: 
\"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d3c0cec-4c3b-4af7-880f-9517c8233886-config\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659997 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-mountpoint-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660065 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-trusted-ca\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660089 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-plugins-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660114 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates\") pod 
\"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660132 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-srv-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660150 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1adf702-1031-43c9-be15-8297a51e6958-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660168 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660185 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b28nm\" (UniqueName: \"kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660205 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660227 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6117aa-1e35-4102-8181-6d6e370c4bee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660245 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19e2de31-96c2-4e95-a83c-f086710a9bc0-proxy-tls\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660339 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-registration-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660362 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660381 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4d7\" (UniqueName: \"kubernetes.io/projected/b63b297b-9f19-4062-b26a-5d1888e1280e-kube-api-access-qx4d7\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660400 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km9t4\" (UniqueName: \"kubernetes.io/projected/ee423424-e9c4-489d-a372-3b6a7dff66bd-kube-api-access-km9t4\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660417 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/072648eb-9536-49d4-a7a8-411ee27e377b-signing-cabundle\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660448 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5bmj\" (UniqueName: \"kubernetes.io/projected/de948299-5822-4c15-b312-ffc6b83f6cc9-kube-api-access-q5bmj\") pod \"migrator-59844c95c7-b8ltz\" (UID: \"de948299-5822-4c15-b312-ffc6b83f6cc9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660467 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/9e6117aa-1e35-4102-8181-6d6e370c4bee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660486 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-config\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660502 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5460cbb-96ab-496c-84cf-e85c25f68fcc-metrics-tls\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660539 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccnm9\" (UniqueName: \"kubernetes.io/projected/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-kube-api-access-ccnm9\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660555 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c645067e-915e-4021-b388-de1c159d99da-metrics-tls\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660571 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vn4n\" (UniqueName: \"kubernetes.io/projected/c078f2c5-1010-4c6e-852a-65b6d94dfa16-kube-api-access-9vn4n\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660588 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwl8h\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-kube-api-access-xwl8h\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660622 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0178be03-ba26-4fb0-88b7-853a34780442-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660642 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660657 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63b297b-9f19-4062-b26a-5d1888e1280e-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660677 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc2fe95-1309-4007-8102-ba43375cf22b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660695 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3ba6ea-9193-4ab8-a6b3-938f4069334a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660729 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660744 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk5sd\" (UniqueName: \"kubernetes.io/projected/b5cb6a22-572a-47fb-978b-b091ab19f2d6-kube-api-access-zk5sd\") pod \"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:36 crc 
kubenswrapper[4856]: I0320 13:26:36.660773 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-metrics-certs\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660801 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660817 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6qbh\" (UniqueName: \"kubernetes.io/projected/0178be03-ba26-4fb0-88b7-853a34780442-kube-api-access-g6qbh\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660838 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628df06a-2257-49f7-9d72-5aa490049230-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: \"628df06a-2257-49f7-9d72-5aa490049230\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660855 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-images\") pod 
\"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660871 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660889 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8v74\" (UniqueName: \"kubernetes.io/projected/8c4e197a-deea-42d4-a7b1-0d83cc546b76-kube-api-access-p8v74\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660906 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-socket-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660924 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660941 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660958 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660979 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.660996 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9zjb\" (UniqueName: \"kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661017 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbvr\" (UniqueName: \"kubernetes.io/projected/ae4a83e0-1d82-4893-af26-ff6f5741b9a0-kube-api-access-xwbvr\") pod \"downloads-7954f5f757-9pbqq\" (UID: \"ae4a83e0-1d82-4893-af26-ff6f5741b9a0\") " 
pod="openshift-console/downloads-7954f5f757-9pbqq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661033 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdwvd\" (UniqueName: \"kubernetes.io/projected/f8c071e4-4f55-46c4-944f-05ba67dec8dd-kube-api-access-qdwvd\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661052 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63b297b-9f19-4062-b26a-5d1888e1280e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661067 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlcc\" (UniqueName: \"kubernetes.io/projected/072648eb-9536-49d4-a7a8-411ee27e377b-kube-api-access-xjlcc\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661087 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-config\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661104 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661123 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9gvn\" (UniqueName: \"kubernetes.io/projected/5dc2fe95-1309-4007-8102-ba43375cf22b-kube-api-access-n9gvn\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661119 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-stats-auth\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661140 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rf7jq\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-kube-api-access-rf7jq\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661157 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-client\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661195 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661212 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l646x\" (UniqueName: \"kubernetes.io/projected/df1ec349-a277-489a-bc19-5644739e80a1-kube-api-access-l646x\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661232 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661251 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3ba6ea-9193-4ab8-a6b3-938f4069334a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661296 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661314 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661332 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx475\" (UniqueName: \"kubernetes.io/projected/7a3728e4-9ce3-4546-9eab-e0b4410532ca-kube-api-access-jx475\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661354 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661369 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661386 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661401 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-profile-collector-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661428 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-serving-cert\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661445 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9tk\" (UniqueName: \"kubernetes.io/projected/19e2de31-96c2-4e95-a83c-f086710a9bc0-kube-api-access-qv9tk\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661462 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661480 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-service-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661496 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661512 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df1ec349-a277-489a-bc19-5644739e80a1-cert\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661529 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lps8f\" (UniqueName: \"kubernetes.io/projected/91878935-9c8c-4927-bafe-16719ddb8461-kube-api-access-lps8f\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661555 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c078f2c5-1010-4c6e-852a-65b6d94dfa16-serving-cert\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661570 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5cb6a22-572a-47fb-978b-b091ab19f2d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661587 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-srv-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.661693 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee423424-e9c4-489d-a372-3b6a7dff66bd-metrics-tls\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.657521 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.662592 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-profile-collector-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.662834 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-config\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.663126 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d3c0cec-4c3b-4af7-880f-9517c8233886-serving-cert\") pod \"service-ca-operator-777779d784-7kv88\" (UID: \"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.663685 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.664203 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc 
kubenswrapper[4856]: I0320 13:26:36.665995 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.666903 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/da3ba6ea-9193-4ab8-a6b3-938f4069334a-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.667172 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.167153063 +0000 UTC m=+212.048179283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.669430 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.670164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.670634 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/072648eb-9536-49d4-a7a8-411ee27e377b-signing-key\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.670637 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-webhook-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: 
\"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.670828 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-trusted-ca\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.657124 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-config\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.656705 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8c4e197a-deea-42d4-a7b1-0d83cc546b76-tmpfs\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.658606 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.657161 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir\") pod \"oauth-openshift-558db77b4-lpbh5\" 
(UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.658566 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1adf702-1031-43c9-be15-8297a51e6958-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.672213 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.659308 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c645067e-915e-4021-b388-de1c159d99da-trusted-ca\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.672798 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-default-certificate\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.674185 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-srv-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.674570 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.675115 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/628df06a-2257-49f7-9d72-5aa490049230-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: \"628df06a-2257-49f7-9d72-5aa490049230\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.675651 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63b297b-9f19-4062-b26a-5d1888e1280e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.675844 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/91878935-9c8c-4927-bafe-16719ddb8461-srv-cert\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:36 crc 
kubenswrapper[4856]: I0320 13:26:36.676721 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/072648eb-9536-49d4-a7a8-411ee27e377b-signing-cabundle\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.677000 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c071e4-4f55-46c4-944f-05ba67dec8dd-service-ca-bundle\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.677003 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c078f2c5-1010-4c6e-852a-65b6d94dfa16-config\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.677154 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9e6117aa-1e35-4102-8181-6d6e370c4bee-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.677362 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 
crc kubenswrapper[4856]: I0320 13:26:36.678676 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-config\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.678746 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.678939 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5dc2fe95-1309-4007-8102-ba43375cf22b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.679449 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8c4e197a-deea-42d4-a7b1-0d83cc546b76-apiservice-cert\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.680020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: 
\"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.680332 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9e6117aa-1e35-4102-8181-6d6e370c4bee-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.681297 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.683951 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-service-ca\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.684817 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.685482 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-etcd-client\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.685942 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/b5cb6a22-572a-47fb-978b-b091ab19f2d6-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.687198 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.687522 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.687931 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8c071e4-4f55-46c4-944f-05ba67dec8dd-metrics-certs\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 
13:26:36.687947 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.688488 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c645067e-915e-4021-b388-de1c159d99da-metrics-tls\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.688987 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b63b297b-9f19-4062-b26a-5d1888e1280e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.691136 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c078f2c5-1010-4c6e-852a-65b6d94dfa16-serving-cert\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.692688 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5dc2fe95-1309-4007-8102-ba43375cf22b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.693609 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.693790 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpwbd\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.694070 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0178be03-ba26-4fb0-88b7-853a34780442-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.696395 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.704490 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/66fe265e-f557-4aff-a055-bb9d0ca82215-profile-collector-cert\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.704491 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/19e2de31-96c2-4e95-a83c-f086710a9bc0-images\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.704933 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/19e2de31-96c2-4e95-a83c-f086710a9bc0-proxy-tls\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.705034 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1adf702-1031-43c9-be15-8297a51e6958-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.705330 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.706294 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.706359 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-serving-cert\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.707656 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.707907 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 
13:26:36.710353 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/da3ba6ea-9193-4ab8-a6b3-938f4069334a-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.716945 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-bound-sa-token\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.734844 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dvdxt"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.741284 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8s67\" (UniqueName: \"kubernetes.io/projected/66fe265e-f557-4aff-a055-bb9d0ca82215-kube-api-access-n8s67\") pod \"catalog-operator-68c6474976-n6l8q\" (UID: \"66fe265e-f557-4aff-a055-bb9d0ca82215\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.756618 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.759726 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlx8m\" (UniqueName: \"kubernetes.io/projected/7d3c0cec-4c3b-4af7-880f-9517c8233886-kube-api-access-vlx8m\") pod \"service-ca-operator-777779d784-7kv88\" (UID: 
\"7d3c0cec-4c3b-4af7-880f-9517c8233886\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765112 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765676 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-certs\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765719 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x66f\" (UniqueName: \"kubernetes.io/projected/e5460cbb-96ab-496c-84cf-e85c25f68fcc-kube-api-access-9x66f\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765781 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bv5l\" (UniqueName: \"kubernetes.io/projected/0086dcd3-2759-447b-907e-926a36e7a25d-kube-api-access-7bv5l\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765822 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa44c47f-e650-4056-9588-51fd98a96b99-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765865 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfnk9\" (UniqueName: \"kubernetes.io/projected/fa44c47f-e650-4056-9588-51fd98a96b99-kube-api-access-bfnk9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765901 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765928 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db9nr\" (UniqueName: \"kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr\") pod \"auto-csr-approver-29566886-xkbwc\" (UID: \"46f98403-30f8-40f6-afa6-6defe5937024\") " pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765952 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5460cbb-96ab-496c-84cf-e85c25f68fcc-config-volume\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.765979 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-node-bootstrap-token\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766004 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-csi-data-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766080 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-mountpoint-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766108 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-plugins-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766146 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766171 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-registration-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766307 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5460cbb-96ab-496c-84cf-e85c25f68fcc-metrics-tls\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766460 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-socket-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766622 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l646x\" (UniqueName: \"kubernetes.io/projected/df1ec349-a277-489a-bc19-5644739e80a1-kube-api-access-l646x\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766687 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx475\" (UniqueName: \"kubernetes.io/projected/7a3728e4-9ce3-4546-9eab-e0b4410532ca-kube-api-access-jx475\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766783 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/df1ec349-a277-489a-bc19-5644739e80a1-cert\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766814 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xglz7\" (UniqueName: \"kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.766966 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-mountpoint-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.767465 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-plugins-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.767647 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-socket-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.767669 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-registration-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.768118 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0086dcd3-2759-447b-907e-926a36e7a25d-csi-data-dir\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.768752 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.768856 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.26883122 +0000 UTC m=+212.149857350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.769365 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5460cbb-96ab-496c-84cf-e85c25f68fcc-config-volume\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.771899 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-certs\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.772224 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fa44c47f-e650-4056-9588-51fd98a96b99-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.772437 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume\") pod \"collect-profiles-29566875-w5vwh\" (UID: 
\"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.774042 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/df1ec349-a277-489a-bc19-5644739e80a1-cert\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.781021 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7a3728e4-9ce3-4546-9eab-e0b4410532ca-node-bootstrap-token\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.790626 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2tnx\" (UniqueName: \"kubernetes.io/projected/f1adf702-1031-43c9-be15-8297a51e6958-kube-api-access-z2tnx\") pod \"kube-storage-version-migrator-operator-b67b599dd-zzhb8\" (UID: \"f1adf702-1031-43c9-be15-8297a51e6958\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.792133 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5460cbb-96ab-496c-84cf-e85c25f68fcc-metrics-tls\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.794744 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.800008 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8v74\" (UniqueName: \"kubernetes.io/projected/8c4e197a-deea-42d4-a7b1-0d83cc546b76-kube-api-access-p8v74\") pod \"packageserver-d55dfcdfc-rrqx6\" (UID: \"8c4e197a-deea-42d4-a7b1-0d83cc546b76\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.815426 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a2d96dd-d627-4ce2-94c1-16cfcf5115a5-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-nf7bz\" (UID: \"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: W0320 13:26:36.817964 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fe7864_7af6_43c6_ac77_bac39a84b3d7.slice/crio-356bd680ed1ed50c23b5c9e5a7f80288d2d94fcce75e6999baf8d966464af4e9 WatchSource:0}: Error finding container 356bd680ed1ed50c23b5c9e5a7f80288d2d94fcce75e6999baf8d966464af4e9: Status 404 returned error can't find the container with id 356bd680ed1ed50c23b5c9e5a7f80288d2d94fcce75e6999baf8d966464af4e9 Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.844211 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.853011 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 
13:26:36.862192 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9zjb\" (UniqueName: \"kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb\") pod \"oauth-openshift-558db77b4-lpbh5\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.862248 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.864507 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" Mar 20 13:26:36 crc kubenswrapper[4856]: W0320 13:26:36.866795 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1d6d53_b4f3_4b21_bd32_b51edb57e5c4.slice/crio-ff53990062952169dd1f8a2c4acf4fa8c3ddb756e792909fe2e0b62cfa94eb7b WatchSource:0}: Error finding container ff53990062952169dd1f8a2c4acf4fa8c3ddb756e792909fe2e0b62cfa94eb7b: Status 404 returned error can't find the container with id ff53990062952169dd1f8a2c4acf4fa8c3ddb756e792909fe2e0b62cfa94eb7b Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.869395 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.871256 4856 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.371243676 +0000 UTC m=+212.252269806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.871361 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbvr\" (UniqueName: \"kubernetes.io/projected/ae4a83e0-1d82-4893-af26-ff6f5741b9a0-kube-api-access-xwbvr\") pod \"downloads-7954f5f757-9pbqq\" (UID: \"ae4a83e0-1d82-4893-af26-ff6f5741b9a0\") " pod="openshift-console/downloads-7954f5f757-9pbqq" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.893670 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdwvd\" (UniqueName: \"kubernetes.io/projected/f8c071e4-4f55-46c4-944f-05ba67dec8dd-kube-api-access-qdwvd\") pod \"router-default-5444994796-pmhrr\" (UID: \"f8c071e4-4f55-46c4-944f-05ba67dec8dd\") " pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.899056 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"] Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.907397 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.915304 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.931644 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4d7\" (UniqueName: \"kubernetes.io/projected/b63b297b-9f19-4062-b26a-5d1888e1280e-kube-api-access-qx4d7\") pod \"openshift-controller-manager-operator-756b6f6bc6-sm6r2\" (UID: \"b63b297b-9f19-4062-b26a-5d1888e1280e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.954619 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km9t4\" (UniqueName: \"kubernetes.io/projected/ee423424-e9c4-489d-a372-3b6a7dff66bd-kube-api-access-km9t4\") pod \"dns-operator-744455d44c-pdv92\" (UID: \"ee423424-e9c4-489d-a372-3b6a7dff66bd\") " pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.966687 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.970225 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:36 crc kubenswrapper[4856]: E0320 13:26:36.970996 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.470973822 +0000 UTC m=+212.351999952 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.974931 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" Mar 20 13:26:36 crc kubenswrapper[4856]: I0320 13:26:36.978129 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlcc\" (UniqueName: \"kubernetes.io/projected/072648eb-9536-49d4-a7a8-411ee27e377b-kube-api-access-xjlcc\") pod \"service-ca-9c57cc56f-26dzg\" (UID: \"072648eb-9536-49d4-a7a8-411ee27e377b\") " pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.000519 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9e6117aa-1e35-4102-8181-6d6e370c4bee-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-p6cdc\" (UID: \"9e6117aa-1e35-4102-8181-6d6e370c4bee\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.030301 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28nm\" (UniqueName: \"kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm\") pod \"marketplace-operator-79b997595-9fh88\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.034524 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.037719 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cvkxl"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.049374 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a807fae-fc6e-4c3e-9de7-a4d1f8b06090-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-gc9r9\" (UID: \"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.053553 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9tk\" (UniqueName: \"kubernetes.io/projected/19e2de31-96c2-4e95-a83c-f086710a9bc0-kube-api-access-qv9tk\") pod \"machine-config-operator-74547568cd-zdppb\" (UID: \"19e2de31-96c2-4e95-a83c-f086710a9bc0\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.073128 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.073211 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5bmj\" (UniqueName: \"kubernetes.io/projected/de948299-5822-4c15-b312-ffc6b83f6cc9-kube-api-access-q5bmj\") pod \"migrator-59844c95c7-b8ltz\" (UID: \"de948299-5822-4c15-b312-ffc6b83f6cc9\") " 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.073375 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.073497 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.073842 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.57382907 +0000 UTC m=+212.454855200 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.075119 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.076451 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.093078 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk5sd\" (UniqueName: \"kubernetes.io/projected/b5cb6a22-572a-47fb-978b-b091ab19f2d6-kube-api-access-zk5sd\") pod \"multus-admission-controller-857f4d67dd-6wxk9\" (UID: \"b5cb6a22-572a-47fb-978b-b091ab19f2d6\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.098539 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-9pbqq" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.103802 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.105019 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.115240 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.119997 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gvn\" (UniqueName: \"kubernetes.io/projected/5dc2fe95-1309-4007-8102-ba43375cf22b-kube-api-access-n9gvn\") pod \"openshift-apiserver-operator-796bbdcf4f-wpdwf\" (UID: \"5dc2fe95-1309-4007-8102-ba43375cf22b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.128686 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.131596 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rf7jq\" (UniqueName: \"kubernetes.io/projected/c645067e-915e-4021-b388-de1c159d99da-kube-api-access-rf7jq\") pod \"ingress-operator-5b745b69d9-4q6ww\" (UID: \"c645067e-915e-4021-b388-de1c159d99da\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.131787 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" event={"ID":"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1","Type":"ContainerStarted","Data":"20b42097747e75cc5d98d920a1b9a963ea9966124eec14ccc52ae6e7970547c3"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.137342 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" event={"ID":"2a2f9676-ad0d-44ce-939a-0c12e90b0d31","Type":"ContainerStarted","Data":"a42628eeb341f3b5f887199c1c7bf01bd80765eb9848bf0c5872a577c9a37e8a"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.156020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2jfr\" (UniqueName: \"kubernetes.io/projected/628df06a-2257-49f7-9d72-5aa490049230-kube-api-access-z2jfr\") pod \"package-server-manager-789f6589d5-bf2lq\" (UID: \"628df06a-2257-49f7-9d72-5aa490049230\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.156572 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.157335 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jwjhv" event={"ID":"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4","Type":"ContainerStarted","Data":"ff53990062952169dd1f8a2c4acf4fa8c3ddb756e792909fe2e0b62cfa94eb7b"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.163953 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" event={"ID":"c3fe7864-7af6-43c6-ac77-bac39a84b3d7","Type":"ContainerStarted","Data":"cb6036a2537d7ec5f5b1b846c88631aca7aca2bcd16eb5d5ea8fba9ae1c9fe62"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.163985 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" event={"ID":"c3fe7864-7af6-43c6-ac77-bac39a84b3d7","Type":"ContainerStarted","Data":"356bd680ed1ed50c23b5c9e5a7f80288d2d94fcce75e6999baf8d966464af4e9"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.164428 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.165727 4856 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-n8vk7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.165764 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerName="route-controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.167598 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" event={"ID":"1d65a22a-bf52-43e0-a4c3-60808f60b2e5","Type":"ContainerStarted","Data":"cfcb091f81e6eced96700ed5c05f0c0faa1141c9364462a73feadc17dc72b455"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.167627 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" event={"ID":"1d65a22a-bf52-43e0-a4c3-60808f60b2e5","Type":"ContainerStarted","Data":"eb170be049ac8ddbafb1eed39bd3a25c734de7c34d90f783d40f5cf6b655b462"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.169527 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.171947 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" event={"ID":"89f7865f-b485-4242-a58d-52252234aa99","Type":"ContainerStarted","Data":"a8da31e6da5ae8e7b2b06632dad2d51e50909298aa87430df1b799bc5dff25e4"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.174966 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.175094 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.675075505 +0000 UTC m=+212.556101635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.175134 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.175327 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.175417 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 
13:26:37.177368 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.178794 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" event={"ID":"3cfd97a6-abeb-4d68-97c3-751aaa6445f9","Type":"ContainerStarted","Data":"cd153f8012c40c0d856b3569687ff886176a952d6de9156ddbbeada5a0a13ae3"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.178830 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" event={"ID":"3cfd97a6-abeb-4d68-97c3-751aaa6445f9","Type":"ContainerStarted","Data":"c53ab1504696117a55b9d9d7d7f64f352bd483c8636a68647ad0e45dacae8c70"} Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.179203 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.180171 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.680155158 +0000 UTC m=+212.561181288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.185787 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fqq88"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.188493 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.189072 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.189160 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.190495 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccnm9\" (UniqueName: \"kubernetes.io/projected/1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee-kube-api-access-ccnm9\") pod \"etcd-operator-b45778765-2nch8\" (UID: \"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.201252 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.201638 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.215940 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vn4n\" (UniqueName: \"kubernetes.io/projected/c078f2c5-1010-4c6e-852a-65b6d94dfa16-kube-api-access-9vn4n\") pod \"console-operator-58897d9998-vzmd5\" (UID: \"c078f2c5-1010-4c6e-852a-65b6d94dfa16\") " pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:37 crc kubenswrapper[4856]: W0320 13:26:37.220228 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda15d3bc4_898a_48c7_a076_23e0911e635e.slice/crio-c2a86e68f59cda6e0669fabc4360c1f751cc97a965c733b7b7fa90209763b035 WatchSource:0}: Error finding container c2a86e68f59cda6e0669fabc4360c1f751cc97a965c733b7b7fa90209763b035: Status 404 returned error can't find the container with id c2a86e68f59cda6e0669fabc4360c1f751cc97a965c733b7b7fa90209763b035 Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.222615 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:37 crc kubenswrapper[4856]: W0320 13:26:37.224439 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0dbcc4e_e764_48d7_bf60_6d0bf32cc9f5.slice/crio-68b24da090efde4ded7e4e418c98ed94979016d164cd8a6d8164281fe1246db1 WatchSource:0}: Error finding container 68b24da090efde4ded7e4e418c98ed94979016d164cd8a6d8164281fe1246db1: Status 404 returned error can't find the container with id 68b24da090efde4ded7e4e418c98ed94979016d164cd8a6d8164281fe1246db1 Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.235488 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.235825 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6qbh\" (UniqueName: \"kubernetes.io/projected/0178be03-ba26-4fb0-88b7-853a34780442-kube-api-access-g6qbh\") pod \"cluster-samples-operator-665b6dd947-w9qd9\" (UID: \"0178be03-ba26-4fb0-88b7-853a34780442\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.266236 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lps8f\" (UniqueName: \"kubernetes.io/projected/91878935-9c8c-4927-bafe-16719ddb8461-kube-api-access-lps8f\") pod \"olm-operator-6b444d44fb-xtxbv\" (UID: \"91878935-9c8c-4927-bafe-16719ddb8461\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.273219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwl8h\" (UniqueName: \"kubernetes.io/projected/da3ba6ea-9193-4ab8-a6b3-938f4069334a-kube-api-access-xwl8h\") pod \"cluster-image-registry-operator-dc59b4c8b-7dgdc\" (UID: \"da3ba6ea-9193-4ab8-a6b3-938f4069334a\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.275946 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.277388 4856 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.777358288 +0000 UTC m=+212.658384418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.280537 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.298118 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x66f\" (UniqueName: \"kubernetes.io/projected/e5460cbb-96ab-496c-84cf-e85c25f68fcc-kube-api-access-9x66f\") pod \"dns-default-vkb56\" (UID: \"e5460cbb-96ab-496c-84cf-e85c25f68fcc\") " pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.311337 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db9nr\" (UniqueName: \"kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr\") pod \"auto-csr-approver-29566886-xkbwc\" (UID: \"46f98403-30f8-40f6-afa6-6defe5937024\") " pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.335195 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xglz7\" (UniqueName: \"kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7\") pod 
\"collect-profiles-29566875-w5vwh\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.341126 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.341200 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.341523 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.344326 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.346383 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.356468 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.364662 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfnk9\" (UniqueName: \"kubernetes.io/projected/fa44c47f-e650-4056-9588-51fd98a96b99-kube-api-access-bfnk9\") pod \"control-plane-machine-set-operator-78cbb6b69f-7dbpg\" (UID: \"fa44c47f-e650-4056-9588-51fd98a96b99\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.368219 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.374675 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.376301 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l646x\" (UniqueName: \"kubernetes.io/projected/df1ec349-a277-489a-bc19-5644739e80a1-kube-api-access-l646x\") pod \"ingress-canary-fph5p\" (UID: \"df1ec349-a277-489a-bc19-5644739e80a1\") " pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.377432 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.377752 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.877740821 +0000 UTC m=+212.758766941 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.381312 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.393740 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx475\" (UniqueName: \"kubernetes.io/projected/7a3728e4-9ce3-4546-9eab-e0b4410532ca-kube-api-access-jx475\") pod \"machine-config-server-ttgqh\" (UID: \"7a3728e4-9ce3-4546-9eab-e0b4410532ca\") " pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.410808 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.415457 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bv5l\" (UniqueName: \"kubernetes.io/projected/0086dcd3-2759-447b-907e-926a36e7a25d-kube-api-access-7bv5l\") pod \"csi-hostpathplugin-7wsbq\" (UID: \"0086dcd3-2759-447b-907e-926a36e7a25d\") " pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.478549 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.479629 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lpbh5"] Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.479819 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:37.979805138 +0000 UTC m=+212.860831268 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.483415 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.515031 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-7kv88"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.548876 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.551623 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.560341 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.571866 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.581901 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.582191 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.082179163 +0000 UTC m=+212.963205293 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.587705 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.595019 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.603166 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.631578 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fph5p" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.637034 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-ttgqh" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.644662 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.673022 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" podStartSLOduration=163.672988877 podStartE2EDuration="2m43.672988877s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:37.631473636 +0000 UTC m=+212.512499766" watchObservedRunningTime="2026-03-20 13:26:37.672988877 +0000 UTC m=+212.554015007" Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.683433 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.683686 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.183670245 +0000 UTC m=+213.064696375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.785043 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.785598 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.285573708 +0000 UTC m=+213.166599838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.805579 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pdv92"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.810191 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.864474 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.870260 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.889331 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.889547 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:38.389530084 +0000 UTC m=+213.270556214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.891577 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.891837 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.391829434 +0000 UTC m=+213.272855564 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.913941 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-9pbqq"] Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.951389 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2"] Mar 20 13:26:37 crc kubenswrapper[4856]: W0320 13:26:37.959413 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee423424_e9c4_489d_a372_3b6a7dff66bd.slice/crio-18d0e270abef56bad58e3e212de95827df56a52c598f7d3a175b3699886660d5 WatchSource:0}: Error finding container 18d0e270abef56bad58e3e212de95827df56a52c598f7d3a175b3699886660d5: Status 404 returned error can't find the container with id 18d0e270abef56bad58e3e212de95827df56a52c598f7d3a175b3699886660d5 Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.992889 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.993126 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.49308197 +0000 UTC m=+213.374108100 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:37 crc kubenswrapper[4856]: I0320 13:26:37.993319 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:37 crc kubenswrapper[4856]: E0320 13:26:37.993715 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.493699326 +0000 UTC m=+213.374725456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.001437 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-6wxk9"] Mar 20 13:26:38 crc kubenswrapper[4856]: W0320 13:26:38.008565 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc645067e_915e_4021_b388_de1c159d99da.slice/crio-95a1ab40e7e9724eb6315029333d394b504e9fd37ff8eddd47646bf7cf24139f WatchSource:0}: Error finding container 95a1ab40e7e9724eb6315029333d394b504e9fd37ff8eddd47646bf7cf24139f: Status 404 returned error can't find the container with id 95a1ab40e7e9724eb6315029333d394b504e9fd37ff8eddd47646bf7cf24139f Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.046573 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9"] Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.095491 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.096581 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.596561993 +0000 UTC m=+213.477588133 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.118390 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-26dzg"] Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.148968 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:26:38 crc kubenswrapper[4856]: W0320 13:26:38.156243 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb63b297b_9f19_4062_b26a_5d1888e1280e.slice/crio-7e4ae554bffc93887a270409d6b2fb03b08bcde13460349c44a2feffbc4e7d85 WatchSource:0}: Error finding container 7e4ae554bffc93887a270409d6b2fb03b08bcde13460349c44a2feffbc4e7d85: Status 404 returned error can't find the container with id 7e4ae554bffc93887a270409d6b2fb03b08bcde13460349c44a2feffbc4e7d85 Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.197648 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:38 crc 
kubenswrapper[4856]: E0320 13:26:38.197944 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.697930743 +0000 UTC m=+213.578956873 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.227596 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" event={"ID":"36be461e-6bde-44a4-8cbe-35a5c8ef8be8","Type":"ContainerStarted","Data":"a81c268701c3fb7aca821b84a612942677488da790e0c02244fc74c963717196"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.227989 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" event={"ID":"36be461e-6bde-44a4-8cbe-35a5c8ef8be8","Type":"ContainerStarted","Data":"0c12740c97d02408fb82c53b2466f05a6f3c335d951be9a88a8e4fb3f2093ec4"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.248426 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" event={"ID":"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1","Type":"ContainerStarted","Data":"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.248673 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.253665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pmhrr" event={"ID":"f8c071e4-4f55-46c4-944f-05ba67dec8dd","Type":"ContainerStarted","Data":"93747cb35cfa9bea91403144165a1b6b8f08d670bb857f0c271b9f0819535ab1"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.260161 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" event={"ID":"b5cb6a22-572a-47fb-978b-b091ab19f2d6","Type":"ContainerStarted","Data":"90736d10ee82ca23237663b699ba42d827801bd84e306b8d0acf25100db6cd01"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.275823 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.294834 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-xkbwc"] Mar 20 13:26:38 crc kubenswrapper[4856]: W0320 13:26:38.298461 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3728e4_9ce3_4546_9eab_e0b4410532ca.slice/crio-0318ed726964713a75853d2797a2a18c3a841a27d0632ed832c893bb19f197c5 WatchSource:0}: Error finding container 0318ed726964713a75853d2797a2a18c3a841a27d0632ed832c893bb19f197c5: Status 404 returned error can't find the container with id 0318ed726964713a75853d2797a2a18c3a841a27d0632ed832c893bb19f197c5 Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.299053 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.300475 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.800451731 +0000 UTC m=+213.681477861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.312502 4856 generic.go:334] "Generic (PLEG): container finished" podID="89f7865f-b485-4242-a58d-52252234aa99" containerID="d9a637ef1c7132f61b43a83d7200797b2a547ef80e21f497109bbab57a1fd2c2" exitCode=0 Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.312609 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" event={"ID":"89f7865f-b485-4242-a58d-52252234aa99","Type":"ContainerDied","Data":"d9a637ef1c7132f61b43a83d7200797b2a547ef80e21f497109bbab57a1fd2c2"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.325146 4856 generic.go:334] "Generic (PLEG): container finished" podID="a15d3bc4-898a-48c7-a076-23e0911e635e" containerID="3f243bb3801a176154f9fcebb41f9986ba1b562fc0dbba536e5bd5c4dc554eda" exitCode=0 Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.325202 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" 
event={"ID":"a15d3bc4-898a-48c7-a076-23e0911e635e","Type":"ContainerDied","Data":"3f243bb3801a176154f9fcebb41f9986ba1b562fc0dbba536e5bd5c4dc554eda"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.325227 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" event={"ID":"a15d3bc4-898a-48c7-a076-23e0911e635e","Type":"ContainerStarted","Data":"c2a86e68f59cda6e0669fabc4360c1f751cc97a965c733b7b7fa90209763b035"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.330408 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz"] Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.343111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" event={"ID":"8c4e197a-deea-42d4-a7b1-0d83cc546b76","Type":"ContainerStarted","Data":"40d31e228a30a5ffa943c439f0b9ee14598e4f7dfdbf9a759239db1211d0eb20"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.358117 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" event={"ID":"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5","Type":"ContainerStarted","Data":"0516bf285f34c3e003bd4e115e5b37e8f94e646eb4b020b6e476abd8fc915537"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.358154 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" event={"ID":"1a2d96dd-d627-4ce2-94c1-16cfcf5115a5","Type":"ContainerStarted","Data":"c6ef9c56d34ca151e8251270338f8b80df7ff2093acbff15fd15feb2e255c5db"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.400458 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.402931 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:38.902914539 +0000 UTC m=+213.783940669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.436971 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" event={"ID":"628df06a-2257-49f7-9d72-5aa490049230","Type":"ContainerStarted","Data":"893f893571e81dcfa4dd9358d74df8c835dc1075f820b2aa3b33a7a907fc845d"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.447747 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" event={"ID":"7d3c0cec-4c3b-4af7-880f-9517c8233886","Type":"ContainerStarted","Data":"567ce71379e2ff7c4d8c1d7afa5af2218b6def47a8d292fbf49f45a05d6cfa29"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.453868 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" 
event={"ID":"798f1ea0-5ae3-41a3-b063-d7014df08ced","Type":"ContainerStarted","Data":"c37bb491c123491a3b579aeed3be3ddea8427919f92ed1caf05771f94fe73b63"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.455622 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jwjhv" event={"ID":"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4","Type":"ContainerStarted","Data":"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.456292 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" event={"ID":"c645067e-915e-4021-b388-de1c159d99da","Type":"ContainerStarted","Data":"95a1ab40e7e9724eb6315029333d394b504e9fd37ff8eddd47646bf7cf24139f"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.478232 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" event={"ID":"19e2de31-96c2-4e95-a83c-f086710a9bc0","Type":"ContainerStarted","Data":"6d1d9510c663a157f722e6eb7ed33f60cd2f6863b411cf9368f5671a639e3796"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.482365 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" event={"ID":"b63b297b-9f19-4062-b26a-5d1888e1280e","Type":"ContainerStarted","Data":"7e4ae554bffc93887a270409d6b2fb03b08bcde13460349c44a2feffbc4e7d85"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.485647 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" event={"ID":"66fe265e-f557-4aff-a055-bb9d0ca82215","Type":"ContainerStarted","Data":"a876c9d28ee14fc921575f8403e2d8b0863e41b15e7c5407fc80b1c5cf24ed49"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.488808 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" event={"ID":"2a2f9676-ad0d-44ce-939a-0c12e90b0d31","Type":"ContainerStarted","Data":"bbfde7c1c371b5cb5d760bc2c53d80a195971be01416dc2016a1a71bc333a60e"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.488834 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" event={"ID":"2a2f9676-ad0d-44ce-939a-0c12e90b0d31","Type":"ContainerStarted","Data":"35c605112cc69335be00240e0f8bdb7810abfcddfc9cf68c49d038564ea66504"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.501756 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.502004 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.001987808 +0000 UTC m=+213.883013938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.507239 4856 generic.go:334] "Generic (PLEG): container finished" podID="1d65a22a-bf52-43e0-a4c3-60808f60b2e5" containerID="cfcb091f81e6eced96700ed5c05f0c0faa1141c9364462a73feadc17dc72b455" exitCode=0 Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.507372 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" event={"ID":"1d65a22a-bf52-43e0-a4c3-60808f60b2e5","Type":"ContainerDied","Data":"cfcb091f81e6eced96700ed5c05f0c0faa1141c9364462a73feadc17dc72b455"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.507458 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" event={"ID":"1d65a22a-bf52-43e0-a4c3-60808f60b2e5","Type":"ContainerStarted","Data":"590f0fb39a472ddf0977f045af9687da0b507ddfb7047617dc6cb8a723c1e3cc"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.513910 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.513932 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" event={"ID":"f1adf702-1031-43c9-be15-8297a51e6958","Type":"ContainerStarted","Data":"5a4982cae144d71f85db3142a4418b1f3c85ab9340c01dd961680553f2ad21af"} Mar 20 13:26:38 crc 
kubenswrapper[4856]: I0320 13:26:38.514725 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" event={"ID":"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5","Type":"ContainerStarted","Data":"68b24da090efde4ded7e4e418c98ed94979016d164cd8a6d8164281fe1246db1"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.525816 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.526871 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" event={"ID":"9e6117aa-1e35-4102-8181-6d6e370c4bee","Type":"ContainerStarted","Data":"bf2a3908fdeb081d49d7ceabd6ae82f1e6e7fee8766f4b2d813a5674c6018722"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.558728 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pbqq" event={"ID":"ae4a83e0-1d82-4893-af26-ff6f5741b9a0","Type":"ContainerStarted","Data":"b3ec72d01850305c3a6968519c529fa2a5d9901360a713f3483c4efdd27a45e2"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.584388 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" event={"ID":"ee423424-e9c4-489d-a372-3b6a7dff66bd","Type":"ContainerStarted","Data":"18d0e270abef56bad58e3e212de95827df56a52c598f7d3a175b3699886660d5"} Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.605583 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:38 crc 
kubenswrapper[4856]: E0320 13:26:38.606685 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.106669163 +0000 UTC m=+213.987695373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.611633 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf"] Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.656027 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dvdxt" podStartSLOduration=164.656005277 podStartE2EDuration="2m44.656005277s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:38.628246095 +0000 UTC m=+213.509272235" watchObservedRunningTime="2026-03-20 13:26:38.656005277 +0000 UTC m=+213.537031407" Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.679965 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.707448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.710211 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.210189138 +0000 UTC m=+214.091215278 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.821694 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.822599 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.322578654 +0000 UTC m=+214.203604794 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.829961 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vkb56"] Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.926445 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.926623 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.426603661 +0000 UTC m=+214.307629791 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:38 crc kubenswrapper[4856]: I0320 13:26:38.926851 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:38 crc kubenswrapper[4856]: E0320 13:26:38.927162 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.427154846 +0000 UTC m=+214.308180976 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.028240 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.028860 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.528843843 +0000 UTC m=+214.409869973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.064468 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jwjhv" podStartSLOduration=165.064451951 podStartE2EDuration="2m45.064451951s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.044318196 +0000 UTC m=+213.925344326" watchObservedRunningTime="2026-03-20 13:26:39.064451951 +0000 UTC m=+213.945478081" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.065909 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-vzmd5"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.136193 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.136753 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:39.636740672 +0000 UTC m=+214.517766802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.152218 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" podStartSLOduration=165.152201674 podStartE2EDuration="2m45.152201674s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.118900037 +0000 UTC m=+213.999926167" watchObservedRunningTime="2026-03-20 13:26:39.152201674 +0000 UTC m=+214.033227804" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.188939 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-c2hg7" podStartSLOduration=165.18892373 podStartE2EDuration="2m45.18892373s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.188255863 +0000 UTC m=+214.069281993" watchObservedRunningTime="2026-03-20 13:26:39.18892373 +0000 UTC m=+214.069949860" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.242735 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.243239 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.743221494 +0000 UTC m=+214.624247624 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: W0320 13:26:39.268589 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc078f2c5_1010_4c6e_852a_65b6d94dfa16.slice/crio-17ab754240e98b2c2219af5ae8fa0ea924adc93c85fd9b5989f991b43fb33761 WatchSource:0}: Error finding container 17ab754240e98b2c2219af5ae8fa0ea924adc93c85fd9b5989f991b43fb33761: Status 404 returned error can't find the container with id 17ab754240e98b2c2219af5ae8fa0ea924adc93c85fd9b5989f991b43fb33761 Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.298577 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-nf7bz" podStartSLOduration=165.298559005 podStartE2EDuration="2m45.298559005s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.27495388 +0000 UTC m=+214.155980010" watchObservedRunningTime="2026-03-20 13:26:39.298559005 +0000 UTC m=+214.179585135" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.317592 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" podStartSLOduration=165.31757420899999 podStartE2EDuration="2m45.317574209s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.314501849 +0000 UTC m=+214.195527979" watchObservedRunningTime="2026-03-20 13:26:39.317574209 +0000 UTC m=+214.198600339" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.319238 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fph5p"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.341165 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2nch8"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.344880 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.345125 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.845111776 +0000 UTC m=+214.726137906 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.445741 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.445992 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:39.945978202 +0000 UTC m=+214.827004332 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.489969 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.546967 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.547293 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.047280769 +0000 UTC m=+214.928306899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.555196 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.556886 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.574210 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9"] Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.576011 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"] Mar 20 13:26:39 crc kubenswrapper[4856]: W0320 13:26:39.586764 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-f5787b4d43f6557ebc33a7aba66455f805a70605106c021e40ec8c9086a3973d WatchSource:0}: Error finding container f5787b4d43f6557ebc33a7aba66455f805a70605106c021e40ec8c9086a3973d: Status 404 returned error can't find the container with id f5787b4d43f6557ebc33a7aba66455f805a70605106c021e40ec8c9086a3973d Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.612287 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-7wsbq"] Mar 20 13:26:39 crc kubenswrapper[4856]: 
I0320 13:26:39.652664 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.653531 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.153509084 +0000 UTC m=+215.034535214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.696207 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" event={"ID":"46f98403-30f8-40f6-afa6-6defe5937024","Type":"ContainerStarted","Data":"c57fd1993efc2d36b45797f84736b8ae5e64ca628dff5b1afe6e152dc5eb7765"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.714291 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"23f79a59046fad49487a3214ea6e4b16b0b00bcd21a1199d9fdcdaa7c43bf743"} Mar 20 13:26:39 crc kubenswrapper[4856]: W0320 13:26:39.720586 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda3ba6ea_9193_4ab8_a6b3_938f4069334a.slice/crio-308f9b6327ed4fffc95dd84e10b810a8c97b0d0f645a2d05a2ed0c58f2361391 WatchSource:0}: Error finding container 308f9b6327ed4fffc95dd84e10b810a8c97b0d0f645a2d05a2ed0c58f2361391: Status 404 returned error can't find the container with id 308f9b6327ed4fffc95dd84e10b810a8c97b0d0f645a2d05a2ed0c58f2361391 Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.741390 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" event={"ID":"072648eb-9536-49d4-a7a8-411ee27e377b","Type":"ContainerStarted","Data":"c2db419c80807e68117dcc257d0d10e2e17351ee807692e6295341f87cf08efb"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.746396 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" event={"ID":"9e6117aa-1e35-4102-8181-6d6e370c4bee","Type":"ContainerStarted","Data":"e3e4f37aca226d54235e29fcf08e5586c2ca8a66d870c988f24a589568619bb2"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.755144 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.755452 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.255440158 +0000 UTC m=+215.136466288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.778169 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-p6cdc" podStartSLOduration=165.778154619 podStartE2EDuration="2m45.778154619s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.775955592 +0000 UTC m=+214.656981722" watchObservedRunningTime="2026-03-20 13:26:39.778154619 +0000 UTC m=+214.659180749" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.790918 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" event={"ID":"8c4e197a-deea-42d4-a7b1-0d83cc546b76","Type":"ContainerStarted","Data":"c523b5d8d57e3ad1d16d020959537e3f6e4d21fe5525131b8d5734156eac157e"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.791226 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.846288 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51090: no serving certificate available for the kubelet" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.857042 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.858157 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.358140431 +0000 UTC m=+215.239166561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.862933 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" event={"ID":"66fe265e-f557-4aff-a055-bb9d0ca82215","Type":"ContainerStarted","Data":"bbb0d6c16b9b659b821680a81a8d13d02a3d159b0d5dfc5b1ac73ccc9965daca"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.864106 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.884878 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" event={"ID":"19e2de31-96c2-4e95-a83c-f086710a9bc0","Type":"ContainerStarted","Data":"4681c767ba027ffe849600f1f312e9be33057d710800ef376867af72548005dd"} Mar 20 13:26:39 crc kubenswrapper[4856]: 
I0320 13:26:39.887821 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.895348 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" podStartSLOduration=165.89533418 podStartE2EDuration="2m45.89533418s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.812835442 +0000 UTC m=+214.693861572" watchObservedRunningTime="2026-03-20 13:26:39.89533418 +0000 UTC m=+214.776360310" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.901393 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-n6l8q" podStartSLOduration=165.901380607 podStartE2EDuration="2m45.901380607s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.894534299 +0000 UTC m=+214.775560429" watchObservedRunningTime="2026-03-20 13:26:39.901380607 +0000 UTC m=+214.782406737" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.924431 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" event={"ID":"b63b297b-9f19-4062-b26a-5d1888e1280e","Type":"ContainerStarted","Data":"e5dd4a7a7a1be47a8b5bacc98bd79845f377f8495748c4cd8031a64a04986fc0"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.936460 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b545ff459d3829f9647c98ce7929381c8054499b476eb7fc1bce901e269af986"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.951569 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" event={"ID":"628df06a-2257-49f7-9d72-5aa490049230","Type":"ContainerStarted","Data":"0f544bbb47129c486b9098c3c1bd1088d1f3c6a8492039ada08b0bb15405c047"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.955612 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51094: no serving certificate available for the kubelet" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.958522 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.958748 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" event={"ID":"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee","Type":"ContainerStarted","Data":"09b597c6246329e15b0e4c0c512c2c0fb8266955ea0d97d3fbf02bb09f6d625d"} Mar 20 13:26:39 crc kubenswrapper[4856]: E0320 13:26:39.959150 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.459138911 +0000 UTC m=+215.340165031 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.966492 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" event={"ID":"798f1ea0-5ae3-41a3-b063-d7014df08ced","Type":"ContainerStarted","Data":"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.967566 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.968828 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-sm6r2" podStartSLOduration=165.968813032 podStartE2EDuration="2m45.968813032s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:39.963237368 +0000 UTC m=+214.844263488" watchObservedRunningTime="2026-03-20 13:26:39.968813032 +0000 UTC m=+214.849839182" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.970498 4856 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-lpbh5 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" start-of-body= Mar 20 13:26:39 crc 
kubenswrapper[4856]: I0320 13:26:39.970565 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.22:6443/healthz\": dial tcp 10.217.0.22:6443: connect: connection refused" Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.986706 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pmhrr" event={"ID":"f8c071e4-4f55-46c4-944f-05ba67dec8dd","Type":"ContainerStarted","Data":"9d189658e77ab7faa2466250f84da71c3305ebb1982ee00c1814389de79cfdab"} Mar 20 13:26:39 crc kubenswrapper[4856]: I0320 13:26:39.999932 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" event={"ID":"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5","Type":"ContainerStarted","Data":"bacd411ad59abdc9e319f3af997f40954ac951e27c110c87ced3030ff96a3368"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.016393 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" event={"ID":"de948299-5822-4c15-b312-ffc6b83f6cc9","Type":"ContainerStarted","Data":"8048d958e8b99367e041bac679fe37ee1d3037159a0fe4e26921d999680aaa19"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.036013 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" podStartSLOduration=166.035997411 podStartE2EDuration="2m46.035997411s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.004298506 +0000 UTC m=+214.885324636" watchObservedRunningTime="2026-03-20 13:26:40.035997411 +0000 UTC m=+214.917023541" Mar 20 13:26:40 
crc kubenswrapper[4856]: I0320 13:26:40.036633 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51108: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.037553 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" event={"ID":"0178be03-ba26-4fb0-88b7-853a34780442","Type":"ContainerStarted","Data":"bbcc8eb2dfe779d5e6a9fe8f6e2e682a14280e03f80c5871ff90348dbb965a17"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.048665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" event={"ID":"f1adf702-1031-43c9-be15-8297a51e6958","Type":"ContainerStarted","Data":"3db76e28b423583182754044e66d96ba41f22fb5acb5e40288a232b5b0e30906"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.059716 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.061005 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.560980202 +0000 UTC m=+215.442006382 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.069260 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-pmhrr" podStartSLOduration=166.069239957 podStartE2EDuration="2m46.069239957s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.036984267 +0000 UTC m=+214.918010397" watchObservedRunningTime="2026-03-20 13:26:40.069239957 +0000 UTC m=+214.950266087" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.069749 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" podStartSLOduration=166.06974276 podStartE2EDuration="2m46.06974276s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.069591516 +0000 UTC m=+214.950617646" watchObservedRunningTime="2026-03-20 13:26:40.06974276 +0000 UTC m=+214.950768890" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.074734 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-9pbqq" event={"ID":"ae4a83e0-1d82-4893-af26-ff6f5741b9a0","Type":"ContainerStarted","Data":"be12b111bdcad93af0a9647bf02f2a10b62c0312f0a42505a9d7434fa85d5ec6"} Mar 20 13:26:40 crc 
kubenswrapper[4856]: I0320 13:26:40.082337 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-9pbqq" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.083435 4856 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pbqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.083487 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pbqq" podUID="ae4a83e0-1d82-4893-af26-ff6f5741b9a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.091746 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkb56" event={"ID":"e5460cbb-96ab-496c-84cf-e85c25f68fcc","Type":"ContainerStarted","Data":"b6216e2be08ad2aeafce983cd80d91dc620746277ca146d357690670d687a5a0"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.114389 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" event={"ID":"ee423424-e9c4-489d-a372-3b6a7dff66bd","Type":"ContainerStarted","Data":"ce00ef41a3e37a4d0e467eb2e7b012503fd11284976d3e78ea4b6b93fc4c2333"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.128768 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" event={"ID":"36be461e-6bde-44a4-8cbe-35a5c8ef8be8","Type":"ContainerStarted","Data":"40e48c2a3b46a01cc536514a96460277fdaf32702db0b7121cc62878ed73027d"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.131336 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zzhb8" podStartSLOduration=166.131325854 podStartE2EDuration="2m46.131325854s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.101772303 +0000 UTC m=+214.982798433" watchObservedRunningTime="2026-03-20 13:26:40.131325854 +0000 UTC m=+215.012351984" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.142418 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-9pbqq" podStartSLOduration=166.142397771 podStartE2EDuration="2m46.142397771s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.130357508 +0000 UTC m=+215.011383648" watchObservedRunningTime="2026-03-20 13:26:40.142397771 +0000 UTC m=+215.023423901" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.142978 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.147345 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" event={"ID":"c078f2c5-1010-4c6e-852a-65b6d94dfa16","Type":"ContainerStarted","Data":"17ab754240e98b2c2219af5ae8fa0ea924adc93c85fd9b5989f991b43fb33761"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.148956 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" event={"ID":"5dc2fe95-1309-4007-8102-ba43375cf22b","Type":"ContainerStarted","Data":"e959c8bbc0d7788f10e66ff6c10eb9a645d50b26562a73ce9a85c20eef1933e7"} Mar 20 13:26:40 crc kubenswrapper[4856]: 
I0320 13:26:40.155766 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cvkxl" podStartSLOduration=166.155752239 podStartE2EDuration="2m46.155752239s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.154450225 +0000 UTC m=+215.035476365" watchObservedRunningTime="2026-03-20 13:26:40.155752239 +0000 UTC m=+215.036778369" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.156718 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerStarted","Data":"3e616392d2def6157d1b434a1e69132620a3fb9d628c013babbfa8d1972ae1a4"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.157690 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.159984 4856 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9fh88 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body= Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.160085 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.162365 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.163597 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.663579563 +0000 UTC m=+215.544605773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.170940 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pmhrr" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.174975 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51122: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.176468 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:26:40 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld Mar 20 13:26:40 crc kubenswrapper[4856]: [+]process-running ok Mar 20 13:26:40 crc kubenswrapper[4856]: 
healthz check failed Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.176663 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.194828 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ttgqh" event={"ID":"7a3728e4-9ce3-4546-9eab-e0b4410532ca","Type":"ContainerStarted","Data":"0318ed726964713a75853d2797a2a18c3a841a27d0632ed832c893bb19f197c5"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.219315 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" event={"ID":"7d3c0cec-4c3b-4af7-880f-9517c8233886","Type":"ContainerStarted","Data":"3f15ed72d0d790e7f1216bc35071257160110accf09fc2234a7c3dfc0670a533"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.249783 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51138: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.264975 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.266055 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.76603799 +0000 UTC m=+215.647064120 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.279981 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f5787b4d43f6557ebc33a7aba66455f805a70605106c021e40ec8c9086a3973d"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.281469 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-7kv88" podStartSLOduration=166.281459322 podStartE2EDuration="2m46.281459322s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.281069102 +0000 UTC m=+215.162095262" watchObservedRunningTime="2026-03-20 13:26:40.281459322 +0000 UTC m=+215.162485452" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.346543 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51142: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.347402 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-ttgqh" podStartSLOduration=6.347389718 podStartE2EDuration="6.347389718s" podCreationTimestamp="2026-03-20 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 13:26:40.321482963 +0000 UTC m=+215.202509103" watchObservedRunningTime="2026-03-20 13:26:40.347389718 +0000 UTC m=+215.228415848" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.348372 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" podStartSLOduration=166.348367594 podStartE2EDuration="2m46.348367594s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:40.347546002 +0000 UTC m=+215.228572132" watchObservedRunningTime="2026-03-20 13:26:40.348367594 +0000 UTC m=+215.229393714" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.354364 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fph5p" event={"ID":"df1ec349-a277-489a-bc19-5644739e80a1","Type":"ContainerStarted","Data":"eeb70446382a6dea32726fd095710d5b4e7388dd57976d9df59007ee81ca8497"} Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.371831 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.374613 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2jgcz" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.375486 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:40.875474159 +0000 UTC m=+215.756500289 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.454685 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51154: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.473429 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.473718 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.973692826 +0000 UTC m=+215.854718946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.473759 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.474528 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:40.974521018 +0000 UTC m=+215.855547148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.578562 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.578932 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.078915225 +0000 UTC m=+215.959941355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.583103 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51170: no serving certificate available for the kubelet" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.681035 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.681464 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.181451284 +0000 UTC m=+216.062477414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.782698 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.782792 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.282775582 +0000 UTC m=+216.163801712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.785622 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.786029 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.286016336 +0000 UTC m=+216.167042466 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.792364 4856 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-rrqx6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.792724 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6" podUID="8c4e197a-deea-42d4-a7b1-0d83cc546b76" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.887202 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.887490 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:41.387474868 +0000 UTC m=+216.268500998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:40 crc kubenswrapper[4856]: I0320 13:26:40.988717 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:40 crc kubenswrapper[4856]: E0320 13:26:40.989034 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.489018051 +0000 UTC m=+216.370044181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.078565 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-w59xx"] Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.079709 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.087067 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.090704 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.090969 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.590950624 +0000 UTC m=+216.471976754 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.092637 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w59xx"] Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.177261 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:26:41 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld Mar 20 13:26:41 crc kubenswrapper[4856]: [+]process-running ok Mar 20 13:26:41 crc kubenswrapper[4856]: healthz check failed Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.177353 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.194999 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk5fw\" (UniqueName: \"kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.195074 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.195127 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.195151 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.195546 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.695534517 +0000 UTC m=+216.576560647 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.290911 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51184: no serving certificate available for the kubelet" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.303177 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.303364 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk5fw\" (UniqueName: \"kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.303451 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.303469 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.304106 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.304174 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.804160444 +0000 UTC m=+216.685186574 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.304589 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.338628 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lk5fw\" (UniqueName: \"kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw\") pod \"community-operators-w59xx\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.406444 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.407062 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:41.907049193 +0000 UTC m=+216.788075323 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.409112 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" event={"ID":"0178be03-ba26-4fb0-88b7-853a34780442","Type":"ContainerStarted","Data":"0216f73f07e09ccf1e084614b2fc7171a207a7bbc6f76c4d92c5a8497af9aa2f"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.409155 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" event={"ID":"0178be03-ba26-4fb0-88b7-853a34780442","Type":"ContainerStarted","Data":"6cd59b255320206382849eec01d6f9ad2c9ef84d29ee6ce0428918fa0c85ae05"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.418525 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.433017 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" event={"ID":"91878935-9c8c-4927-bafe-16719ddb8461","Type":"ContainerStarted","Data":"89bb83430ee01483dc1b75963a9f5b85674b16000140d111b9f14bfb0b0b2ed4"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.433063 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" event={"ID":"91878935-9c8c-4927-bafe-16719ddb8461","Type":"ContainerStarted","Data":"a6a44940a31b6453e4352dcca2bfcfa0e15b77dbfa00f2e85144ca797897c908"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.433939 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.436870 4856 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-xtxbv container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.436901 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" podUID="91878935-9c8c-4927-bafe-16719ddb8461" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.474998 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" podStartSLOduration=167.474981791 
podStartE2EDuration="2m47.474981791s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.464068937 +0000 UTC m=+216.345095077" watchObservedRunningTime="2026-03-20 13:26:41.474981791 +0000 UTC m=+216.356007921" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.476665 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-w9qd9" podStartSLOduration=167.476657445 podStartE2EDuration="2m47.476657445s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.439793645 +0000 UTC m=+216.320819775" watchObservedRunningTime="2026-03-20 13:26:41.476657445 +0000 UTC m=+216.357683575" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.479485 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.480331 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.482259 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"eb3c9e073a9d8906e2d0d1a6799d7c06d259b37fd588ae707bbdb149dc77e241"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.482320 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.488938 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" event={"ID":"0086dcd3-2759-447b-907e-926a36e7a25d","Type":"ContainerStarted","Data":"a2d5444ad6329f0690843eaf357e59d3133f10df625d95ab582cab91dbbd98c6"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.494947 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.498038 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" event={"ID":"b5cb6a22-572a-47fb-978b-b091ab19f2d6","Type":"ContainerStarted","Data":"3f030c0f022f01d151a817f3382080e37b32cd913a45207cb00c9db52218d1ea"} Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.512010 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.512250 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.512325 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdvkj\" (UniqueName: \"kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.512343 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.515398 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.015352012 +0000 UTC m=+216.896378212 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.523657 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" event={"ID":"ee423424-e9c4-489d-a372-3b6a7dff66bd","Type":"ContainerStarted","Data":"d735a301ac61a8b92efce991673545b09c7dc85e21cec0052df8bd4423d147c1"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.577212 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerStarted","Data":"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.577761 4856 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9fh88 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" start-of-body=
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.577795 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.624154 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.624209 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.624260 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdvkj\" (UniqueName: \"kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.624295 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.625600 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.625879 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pdv92" podStartSLOduration=167.625858659 podStartE2EDuration="2m47.625858659s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.623070507 +0000 UTC m=+216.504096627" watchObservedRunningTime="2026-03-20 13:26:41.625858659 +0000 UTC m=+216.506884789"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.626765 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.626977 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.126963958 +0000 UTC m=+217.007990088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.628460 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" event={"ID":"89f7865f-b485-4242-a58d-52252234aa99","Type":"ContainerStarted","Data":"97cd15b53d1a65df3abc634d8d3813131ac525ac719cb702991008ff1dafa6f2"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.634457 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkb56" event={"ID":"e5460cbb-96ab-496c-84cf-e85c25f68fcc","Type":"ContainerStarted","Data":"89e071af93c81f35d417cbc439f80187ca07e89e8b4474e6f1799c9d56b15162"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.649715 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" event={"ID":"19e2de31-96c2-4e95-a83c-f086710a9bc0","Type":"ContainerStarted","Data":"e50cf3c11baf62ce10b4d6eb0fea459ec557df259bcf66c269c791053100cd14"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.668296 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths" podStartSLOduration=167.668278503 podStartE2EDuration="2m47.668278503s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.66623062 +0000 UTC m=+216.547256750" watchObservedRunningTime="2026-03-20 13:26:41.668278503 +0000 UTC m=+216.549304633"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.679643 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q9k5z" event={"ID":"d0dbcc4e-e764-48d7-bf60-6d0bf32cc9f5","Type":"ContainerStarted","Data":"64ee80ebfe93fe94fb4f5628a19259f88b8ef6438119f726d9f35f54a6f0ebb4"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.680593 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdvkj\" (UniqueName: \"kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj\") pod \"community-operators-m6lhf\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.692175 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"]
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.693214 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.695618 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.701958 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"]
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.704687 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-ttgqh" event={"ID":"7a3728e4-9ce3-4546-9eab-e0b4410532ca","Type":"ContainerStarted","Data":"dd66aff9a2054aa9c33c2771ab8ba1ae15deac129092b6e07a816667e9395782"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.706158 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-zdppb" podStartSLOduration=167.706134909 podStartE2EDuration="2m47.706134909s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.695878682 +0000 UTC m=+216.576904822" watchObservedRunningTime="2026-03-20 13:26:41.706134909 +0000 UTC m=+216.587161039"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.719219 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" event={"ID":"c078f2c5-1010-4c6e-852a-65b6d94dfa16","Type":"ContainerStarted","Data":"f232956c859d6bfdc9025898412ef8770be0fceb4fbdc18afc95f6608000aa97"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.720035 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-vzmd5"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.730592 4856 patch_prober.go:28] interesting pod/console-operator-58897d9998-vzmd5 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.730635 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" podUID="c078f2c5-1010-4c6e-852a-65b6d94dfa16" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/readyz\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.743418 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.744933 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.244918039 +0000 UTC m=+217.125944169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.756774 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" podStartSLOduration=167.756758786 podStartE2EDuration="2m47.756758786s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.754994951 +0000 UTC m=+216.636021091" watchObservedRunningTime="2026-03-20 13:26:41.756758786 +0000 UTC m=+216.637784906"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.762127 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" event={"ID":"5dc2fe95-1309-4007-8102-ba43375cf22b","Type":"ContainerStarted","Data":"663be82a990cf0719a5b93ba5ef11010f3056c64ef75e259bda0139c94cd1b25"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.765970 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" event={"ID":"1f0acb35-a304-43e3-8306-5c5319d0e8e8","Type":"ContainerStarted","Data":"2f1a52b951c5480ea3fd08fab44d8c11ef8195ee8461d4246c8df67127bf2522"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.765994 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" event={"ID":"1f0acb35-a304-43e3-8306-5c5319d0e8e8","Type":"ContainerStarted","Data":"4dda566d1e6539831985cb538e5f574f195af6d8ef80e1a2eba827a80b900bb4"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.768617 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e30f34749b95e5ff58cfaad9f700302f7cbd887b68f1d7d35504882dc75c8c87"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.774498 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" event={"ID":"de948299-5822-4c15-b312-ffc6b83f6cc9","Type":"ContainerStarted","Data":"5c398b9f6db8a7a5bb00e600dffba0865ead14c88c4382e642aec3629d883e0b"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.774535 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" event={"ID":"de948299-5822-4c15-b312-ffc6b83f6cc9","Type":"ContainerStarted","Data":"dc4a8104c9a55a4cd1fae772d24b280c189d2e17e60ab0163beecb2cd5d9747c"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.797607 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" event={"ID":"072648eb-9536-49d4-a7a8-411ee27e377b","Type":"ContainerStarted","Data":"c6e9f165a69a82e98f32483baa01076fbef46fc71c59497433575aa86610e6ce"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.805245 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wpdwf" podStartSLOduration=167.805228389 podStartE2EDuration="2m47.805228389s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.803960985 +0000 UTC m=+216.684987135" watchObservedRunningTime="2026-03-20 13:26:41.805228389 +0000 UTC m=+216.686254519"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.827958 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" podStartSLOduration=167.82794037 podStartE2EDuration="2m47.82794037s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.827065277 +0000 UTC m=+216.708091407" watchObservedRunningTime="2026-03-20 13:26:41.82794037 +0000 UTC m=+216.708966520"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.833551 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" event={"ID":"fa44c47f-e650-4056-9588-51fd98a96b99","Type":"ContainerStarted","Data":"b2d7766fb7612eb486757d1905af480860b198d580b71ff2a78b0b200deadf47"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.839600 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6lhf"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.850478 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.850539 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.850714 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.850731 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dddz\" (UniqueName: \"kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.855421 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.355390605 +0000 UTC m=+217.236416735 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.861521 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ngck4"]
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.862463 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.871043 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" event={"ID":"da3ba6ea-9193-4ab8-a6b3-938f4069334a","Type":"ContainerStarted","Data":"bdf085c4853e5c066103eabd8461c2d920c9e36f15f37b44b69aac9d6f0fdf69"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.871089 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" event={"ID":"da3ba6ea-9193-4ab8-a6b3-938f4069334a","Type":"ContainerStarted","Data":"308f9b6327ed4fffc95dd84e10b810a8c97b0d0f645a2d05a2ed0c58f2361391"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.883465 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngck4"]
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.884303 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-b8ltz" podStartSLOduration=167.884288177 podStartE2EDuration="2m47.884288177s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.880009015 +0000 UTC m=+216.761035145" watchObservedRunningTime="2026-03-20 13:26:41.884288177 +0000 UTC m=+216.765314307"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.894231 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" event={"ID":"a15d3bc4-898a-48c7-a076-23e0911e635e","Type":"ContainerStarted","Data":"b8521279ca563cbb8bb1bb1a190df9141b4181d437f204dff34ae06b66cafd07"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.895992 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" event={"ID":"1d6cc99e-aa96-47ce-ac9a-8e1a8f222aee","Type":"ContainerStarted","Data":"5192b8dcd2f8e18fcc274dc488b73498bcebeb7158588c81bc34b173b916183a"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.900883 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" event={"ID":"c645067e-915e-4021-b388-de1c159d99da","Type":"ContainerStarted","Data":"03f12a9156017b9adc514a89dd0c6ea24734f2fb859f59d1363df6b15e4b1b97"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.900910 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" event={"ID":"c645067e-915e-4021-b388-de1c159d99da","Type":"ContainerStarted","Data":"add79816de665bc7f15c083a96c45524a9cf7cc62ee148c77723031484ded591"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.938311 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fph5p" event={"ID":"df1ec349-a277-489a-bc19-5644739e80a1","Type":"ContainerStarted","Data":"3347a9360a548201c13db074c3a9aaf5f1e93c21e22ebb9ee5d688aa44fe2612"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.944508 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-26dzg" podStartSLOduration=167.944492374 podStartE2EDuration="2m47.944492374s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.93934024 +0000 UTC m=+216.820366380" watchObservedRunningTime="2026-03-20 13:26:41.944492374 +0000 UTC m=+216.825518504"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.954950 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955138 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955192 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcc92\" (UniqueName: \"kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955433 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955457 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955522 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.955539 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dddz\" (UniqueName: \"kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: E0320 13:26:41.956401 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.456385884 +0000 UTC m=+217.337412004 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.956715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.958128 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.977573 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" event={"ID":"628df06a-2257-49f7-9d72-5aa490049230","Type":"ContainerStarted","Data":"6a9b0a32918c298c278901f980bc3ae76c3168461d5289a846b99abe0f7494d6"}
Mar 20 13:26:41 crc kubenswrapper[4856]: I0320 13:26:41.977628 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.014494 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dddz\" (UniqueName: \"kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz\") pod \"certified-operators-dqwl2\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.021287 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"78736fc3a2cb1265486f9819afb4dd9bc8b2d87bd1cb5cde421450cc166c2496"}
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.033387 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2nch8" podStartSLOduration=168.033370738 podStartE2EDuration="2m48.033370738s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.032925226 +0000 UTC m=+216.913951366" watchObservedRunningTime="2026-03-20 13:26:42.033370738 +0000 UTC m=+216.914396868"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.035224 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqwl2"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.045365 4856 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pbqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.045412 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pbqq" podUID="ae4a83e0-1d82-4893-af26-ff6f5741b9a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.045665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" event={"ID":"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090","Type":"ContainerStarted","Data":"6d5ad511d059e17ea55f66e7db5e6971f7f6aa83f9537d8b1ed308d5d9d8c14c"}
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.057765 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.057801 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.057879 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.057923 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcc92\" (UniqueName: \"kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.058957 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.059156 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.060165 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.560154265 +0000 UTC m=+217.441180395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.065819 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.068416 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-4q6ww" podStartSLOduration=168.06840027 podStartE2EDuration="2m48.06840027s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.06614455 +0000 UTC m=+216.947170680" watchObservedRunningTime="2026-03-20 13:26:42.06840027 +0000 UTC m=+216.949426400"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.100844 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-w59xx"]
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.103726 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-rrqx6"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.107010 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcc92\" (UniqueName: \"kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92\") pod \"certified-operators-ngck4\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.128316 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7dgdc" podStartSLOduration=168.128301279 podStartE2EDuration="2m48.128301279s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.125796504 +0000 UTC m=+217.006822624" watchObservedRunningTime="2026-03-20 13:26:42.128301279 +0000 UTC m=+217.009327409"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.158702 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.207013 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.706991257 +0000 UTC m=+217.588017387 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.226099 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:42 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:42 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:42 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.226168 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.233285 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fph5p" podStartSLOduration=8.233248991 podStartE2EDuration="8.233248991s" podCreationTimestamp="2026-03-20 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.202501201 +0000 UTC m=+217.083527351" watchObservedRunningTime="2026-03-20 13:26:42.233248991 +0000 UTC m=+217.114275121"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.239452 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngck4"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.280551 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.280855 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.78084306 +0000 UTC m=+217.661869190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.378573 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" podStartSLOduration=168.378553554 podStartE2EDuration="2m48.378553554s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.326631471 +0000 UTC m=+217.207657601" watchObservedRunningTime="2026-03-20 13:26:42.378553554 +0000 UTC m=+217.259579684"
Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.381108 4856 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" podStartSLOduration=168.381099399 podStartE2EDuration="2m48.381099399s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.377904497 +0000 UTC m=+217.258930627" watchObservedRunningTime="2026-03-20 13:26:42.381099399 +0000 UTC m=+217.262125529" Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.381913 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.382192 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:42.882177058 +0000 UTC m=+217.763203178 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.424572 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" podStartSLOduration=168.424557491 podStartE2EDuration="2m48.424557491s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.413199265 +0000 UTC m=+217.294225395" watchObservedRunningTime="2026-03-20 13:26:42.424557491 +0000 UTC m=+217.305583621" Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.427000 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.482915 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.483200 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:42.983188967 +0000 UTC m=+217.864215087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.532966 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"] Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.584843 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.585436 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.085419898 +0000 UTC m=+217.966446018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.632577 4856 ???:1] "http: TLS handshake error from 192.168.126.11:51190: no serving certificate available for the kubelet" Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.665050 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" podStartSLOduration=168.665032551 podStartE2EDuration="2m48.665032551s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:42.5824052 +0000 UTC m=+217.463431350" watchObservedRunningTime="2026-03-20 13:26:42.665032551 +0000 UTC m=+217.546058681" Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.666614 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"] Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.666789 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerName="route-controller-manager" containerID="cri-o://cb6036a2537d7ec5f5b1b846c88631aca7aca2bcd16eb5d5ea8fba9ae1c9fe62" gracePeriod=30 Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.687091 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.687516 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.187481475 +0000 UTC m=+218.068507605 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.788029 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.788432 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.288415863 +0000 UTC m=+218.169441993 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.891970 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.892638 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.392626366 +0000 UTC m=+218.273652496 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:42 crc kubenswrapper[4856]: I0320 13:26:42.993775 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:42 crc kubenswrapper[4856]: E0320 13:26:42.994071 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.494054556 +0000 UTC m=+218.375080686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.058342 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ngck4"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.095432 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.095743 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.595732033 +0000 UTC m=+218.476758163 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.104072 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-gc9r9" event={"ID":"5a807fae-fc6e-4c3e-9de7-a4d1f8b06090","Type":"ContainerStarted","Data":"60c5af9ae6909d950a37fad9f2d63803ea3cdfd377df95dd478da5402ee34841"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.113050 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.142539 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" event={"ID":"0086dcd3-2759-447b-907e-926a36e7a25d","Type":"ContainerStarted","Data":"a1ad14a1219958c69e58d1a71568e573a3448b17c42ea1b8f561b3effaf8043e"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.174140 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-7dbpg" event={"ID":"fa44c47f-e650-4056-9588-51fd98a96b99","Type":"ContainerStarted","Data":"2280d9b15bdc3f38a8729d40009676091b2fdf147a0a7009deff787203bad039"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.177496 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:26:43 crc 
kubenswrapper[4856]: [-]has-synced failed: reason withheld Mar 20 13:26:43 crc kubenswrapper[4856]: [+]process-running ok Mar 20 13:26:43 crc kubenswrapper[4856]: healthz check failed Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.177546 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.180958 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vkb56" event={"ID":"e5460cbb-96ab-496c-84cf-e85c25f68fcc","Type":"ContainerStarted","Data":"120a24a2bf3db62fa8bec9afc4383af4fa085bb5246db174e255521728fb2309"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.181037 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vkb56" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.187333 4856 generic.go:334] "Generic (PLEG): container finished" podID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerID="d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad" exitCode=0 Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.187670 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerDied","Data":"d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.187710 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerStarted","Data":"fc5ad811eac3972321c3b8fd197955d720900f8d14c74fbf6e28b860e35ac3b9"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.192941 4856 generic.go:334] "Generic (PLEG): container finished" 
podID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerID="cb6036a2537d7ec5f5b1b846c88631aca7aca2bcd16eb5d5ea8fba9ae1c9fe62" exitCode=0 Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.193109 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" event={"ID":"c3fe7864-7af6-43c6-ac77-bac39a84b3d7","Type":"ContainerDied","Data":"cb6036a2537d7ec5f5b1b846c88631aca7aca2bcd16eb5d5ea8fba9ae1c9fe62"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.201186 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.202229 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.702208385 +0000 UTC m=+218.583234515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.203114 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vkb56" podStartSLOduration=9.203096938 podStartE2EDuration="9.203096938s" podCreationTimestamp="2026-03-20 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:43.200720927 +0000 UTC m=+218.081747067" watchObservedRunningTime="2026-03-20 13:26:43.203096938 +0000 UTC m=+218.084123068" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.218673 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" event={"ID":"b5cb6a22-572a-47fb-978b-b091ab19f2d6","Type":"ContainerStarted","Data":"458bff76eaea7a9f9a5d9ff6f2d78ce5fafd0f8bf37090c28c692bce7f76d9bb"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.220745 4856 generic.go:334] "Generic (PLEG): container finished" podID="b99e422b-ccde-422a-869f-7898a008a66a" containerID="948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74" exitCode=0 Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.220806 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerDied","Data":"948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.220821 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerStarted","Data":"3e284f68faf306d2b6e6172d5e87043bed5407c84abc123ddb02cb2a70849ec5"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.242472 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" podUID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" containerName="controller-manager" containerID="cri-o://4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0" gracePeriod=30 Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.243784 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fqq88" event={"ID":"a15d3bc4-898a-48c7-a076-23e0911e635e","Type":"ContainerStarted","Data":"14b17cd216041f60810bebf1e277652204c8e1027c2f073128443e59d32a59b2"} Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.244995 4856 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pbqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.245049 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pbqq" podUID="ae4a83e0-1d82-4893-af26-ff6f5741b9a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.245090 4856 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-9fh88 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: 
connection refused" start-of-body= Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.245149 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": dial tcp 10.217.0.37:8080: connect: connection refused" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.276256 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-6wxk9" podStartSLOduration=169.276240733 podStartE2EDuration="2m49.276240733s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:43.245755249 +0000 UTC m=+218.126781399" watchObservedRunningTime="2026-03-20 13:26:43.276240733 +0000 UTC m=+218.157266863" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.276825 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-xtxbv" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.290280 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.292993 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.298154 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.303424 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.303696 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.803684166 +0000 UTC m=+218.684710296 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.310552 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.415734 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.416073 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.417848 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:43.917826338 +0000 UTC m=+218.798852468 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.441099 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.441660 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngph6\" (UniqueName: \"kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.441817 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.463499 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:26:43.963471276 +0000 UTC m=+218.844497406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.515948 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.548724 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.548943 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.549046 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngph6\" (UniqueName: \"kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 
20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.549077 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.549546 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.549633 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.049614759 +0000 UTC m=+218.930640899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.549882 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.574477 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-vzmd5" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.650564 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config\") pod \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.650766 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca\") pod \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.650787 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert\") pod 
\"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.650837 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnv7k\" (UniqueName: \"kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k\") pod \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\" (UID: \"c3fe7864-7af6-43c6-ac77-bac39a84b3d7\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.651025 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.651391 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.151380078 +0000 UTC m=+219.032406208 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.652102 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config" (OuterVolumeSpecName: "config") pod "c3fe7864-7af6-43c6-ac77-bac39a84b3d7" (UID: "c3fe7864-7af6-43c6-ac77-bac39a84b3d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.652491 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3fe7864-7af6-43c6-ac77-bac39a84b3d7" (UID: "c3fe7864-7af6-43c6-ac77-bac39a84b3d7"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.664421 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngph6\" (UniqueName: \"kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6\") pod \"redhat-marketplace-5w74v\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.664895 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k" (OuterVolumeSpecName: "kube-api-access-vnv7k") pod "c3fe7864-7af6-43c6-ac77-bac39a84b3d7" (UID: "c3fe7864-7af6-43c6-ac77-bac39a84b3d7"). InnerVolumeSpecName "kube-api-access-vnv7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.690751 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3fe7864-7af6-43c6-ac77-bac39a84b3d7" (UID: "c3fe7864-7af6-43c6-ac77-bac39a84b3d7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.724541 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"] Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.724743 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerName="route-controller-manager" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.724756 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerName="route-controller-manager" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.724888 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" containerName="route-controller-manager" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.731768 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.746324 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.752115 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.752406 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.752419 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.752427 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.752436 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnv7k\" (UniqueName: \"kubernetes.io/projected/c3fe7864-7af6-43c6-ac77-bac39a84b3d7-kube-api-access-vnv7k\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.752856 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.252837409 +0000 UTC m=+219.133863539 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.793914 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.794720 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.816773 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.832580 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.836508 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.851413 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.853072 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6wj\" (UniqueName: \"kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.853123 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.853151 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities\") pod 
\"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.853194 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.853519 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.35350673 +0000 UTC m=+219.234532860 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.954497 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.954972 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.955030 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6wj\" (UniqueName: \"kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.955054 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.955091 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.955134 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.955796 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:43 crc kubenswrapper[4856]: E0320 13:26:43.955864 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.455849014 +0000 UTC m=+219.336875144 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:43 crc kubenswrapper[4856]: I0320 13:26:43.956075 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.000446 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6wj\" (UniqueName: \"kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj\") pod \"redhat-marketplace-7jsvf\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.008338 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059151 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert\") pod \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059194 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config\") pod \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059221 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54mql\" (UniqueName: \"kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql\") pod \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059246 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca\") pod \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059297 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles\") pod \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\" (UID: \"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059370 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059390 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.059414 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.059729 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.559716738 +0000 UTC m=+219.440742868 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.063578 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" (UID: "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.064122 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config" (OuterVolumeSpecName: "config") pod "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" (UID: "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.065543 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.065869 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca" (OuterVolumeSpecName: "client-ca") pod "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" (UID: "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.065975 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" (UID: "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.079928 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql" (OuterVolumeSpecName: "kube-api-access-54mql") pod "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" (UID: "86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1"). InnerVolumeSpecName "kube-api-access-54mql". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.083896 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.155306 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.163163 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.163991 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.164005 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.164014 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54mql\" (UniqueName: \"kubernetes.io/projected/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-kube-api-access-54mql\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.164023 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.164030 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.164097 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.664081225 +0000 UTC m=+219.545107355 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.178134 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:44 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:44 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:44 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.178180 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.203052 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.271783 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.272140 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.772127037 +0000 UTC m=+219.653153167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.292710 4856 generic.go:334] "Generic (PLEG): container finished" podID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerID="85e76a5bdc9146299973b51fe02b98cd3d26f5f403a9956ddf0f68966bd945c9" exitCode=0
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.293529 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerDied","Data":"85e76a5bdc9146299973b51fe02b98cd3d26f5f403a9956ddf0f68966bd945c9"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.293554 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerStarted","Data":"fd46369b715b54706d28676e0bdfc5e06b580929d479c801301c5dbbbb0cd0ba"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.294173 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.299063 4856 generic.go:334] "Generic (PLEG): container finished" podID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerID="eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a" exitCode=0
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.299153 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerDied","Data":"eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.299170 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerStarted","Data":"41db8f10710e46e0522d6d9c97340884ab21c1b405a37bf95a4090e6fc7bc91e"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.305549 4856 generic.go:334] "Generic (PLEG): container finished" podID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" containerID="4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0" exitCode=0
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.305638 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" event={"ID":"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1","Type":"ContainerDied","Data":"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.305668 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv" event={"ID":"86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1","Type":"ContainerDied","Data":"20b42097747e75cc5d98d920a1b9a963ea9966124eec14ccc52ae6e7970547c3"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.305684 4856 scope.go:117] "RemoveContainer" containerID="4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.305802 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-dtwhv"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.351856 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.352091 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7" event={"ID":"c3fe7864-7af6-43c6-ac77-bac39a84b3d7","Type":"ContainerDied","Data":"356bd680ed1ed50c23b5c9e5a7f80288d2d94fcce75e6999baf8d966464af4e9"}
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.380662 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.381592 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.881577206 +0000 UTC m=+219.762603336 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.413142 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.413185 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-n8vk7"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.424805 4856 scope.go:117] "RemoveContainer" containerID="4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.424903 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"]
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.425362 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0\": container with ID starting with 4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0 not found: ID does not exist" containerID="4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.425386 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0"} err="failed to get container status \"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0\": rpc error: code = NotFound desc = could not find container \"4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0\": container with ID starting with 4e731d3191188e8a068c87543328b1667803a1f6097cc73490f80b1aefb6f3e0 not found: ID does not exist"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.425406 4856 scope.go:117] "RemoveContainer" containerID="cb6036a2537d7ec5f5b1b846c88631aca7aca2bcd16eb5d5ea8fba9ae1c9fe62"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.459323 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-dtwhv"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.489198 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.489953 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:44.989941268 +0000 UTC m=+219.870967398 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.590206 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.590782 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.090765343 +0000 UTC m=+219.971791473 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.669088 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tgq7q"]
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.669431 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" containerName="controller-manager"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.669452 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" containerName="controller-manager"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.669548 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" containerName="controller-manager"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.671231 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.672701 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.672781 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgq7q"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.675639 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.692048 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.692078 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmsmq\" (UniqueName: \"kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.692112 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.692148 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.692473 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.192461439 +0000 UTC m=+220.073487569 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: W0320 13:26:44.705171 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2ecb9fda_fb28_4f6e_a7a8_da20c7755351.slice/crio-f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b WatchSource:0}: Error finding container f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b: Status 404 returned error can't find the container with id f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.768591 4856 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.785542 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.793183 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.793379 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.293358266 +0000 UTC m=+220.174384396 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.793462 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmsmq\" (UniqueName: \"kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.793491 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.793552 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.794001 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.293983923 +0000 UTC m=+220.175010053 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.795343 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.795402 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.795779 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: W0320 13:26:44.809329 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e32090f_521c_4585_884b_650644c11aee.slice/crio-828efe73d78d1217a9adf108b65ddc42a4957c70b2d547b3a62f084d5541de6e WatchSource:0}: Error finding container 828efe73d78d1217a9adf108b65ddc42a4957c70b2d547b3a62f084d5541de6e: Status 404 returned error can't find the container with id 828efe73d78d1217a9adf108b65ddc42a4957c70b2d547b3a62f084d5541de6e
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.817353 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmsmq\" (UniqueName: \"kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq\") pod \"redhat-operators-tgq7q\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.897129 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.897306 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.397281691 +0000 UTC m=+220.278307821 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.897622 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:44 crc kubenswrapper[4856]: E0320 13:26:44.897925 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.397909578 +0000 UTC m=+220.278935708 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.946590 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.947421 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.950513 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.950709 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.950814 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.950891 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"]
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.950940 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.951072 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.961431 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:26:44 crc kubenswrapper[4856]: I0320 13:26:44.991140 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tgq7q"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:44.999723 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:45 crc kubenswrapper[4856]: E0320 13:26:44.999980 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.499957265 +0000 UTC m=+220.380983395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.000014 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.000062 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.000238 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.000338 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlxzb\" (UniqueName: \"kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.000413 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: E0320 13:26:45.000683 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.500675743 +0000 UTC m=+220.381701883 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.058750 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"]
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.060462 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62lq9"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.070430 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"]
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:26:45 crc kubenswrapper[4856]: E0320 13:26:45.102342 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.602260058 +0000 UTC m=+220.483286188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102388 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102422 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102461 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlxzb\" (UniqueName: \"kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102499 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjtx\" (UniqueName: \"kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102523 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102545 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102562 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.102593 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:45 crc kubenswrapper[4856]: E0320 13:26:45.102840 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.602827222 +0000 UTC m=+220.483853352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-84k9j" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.103552 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.105235 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.117848 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") "
pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.125889 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlxzb\" (UniqueName: \"kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb\") pod \"route-controller-manager-566c496b48-t6ds6\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.175562 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:26:45 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld Mar 20 13:26:45 crc kubenswrapper[4856]: [+]process-running ok Mar 20 13:26:45 crc kubenswrapper[4856]: healthz check failed Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.175622 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.208715 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.209288 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.209330 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjtx\" (UniqueName: \"kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.209353 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.209689 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: E0320 13:26:45.209751 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:26:45.709737076 +0000 UTC m=+220.590763206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.209931 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.238860 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjtx\" (UniqueName: \"kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx\") pod \"redhat-operators-62lq9\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.249703 4856 ???:1] "http: TLS handshake error from 192.168.126.11:48676: no serving certificate available for the kubelet" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.267914 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.281209 4856 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:26:44.768621913Z","Handler":null,"Name":""} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.284516 4856 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.284548 4856 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.311738 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.324339 4856 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.324439 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.357481 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-84k9j\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.365029 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e32090f-521c-4585-884b-650644c11aee" containerID="50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49" exitCode=0 Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.365111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerDied","Data":"50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.365144 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerStarted","Data":"828efe73d78d1217a9adf108b65ddc42a4957c70b2d547b3a62f084d5541de6e"} Mar 20 13:26:45 crc 
kubenswrapper[4856]: I0320 13:26:45.384954 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" event={"ID":"0086dcd3-2759-447b-907e-926a36e7a25d","Type":"ContainerStarted","Data":"4484116d583a6590a647e2f0380f556f0a97b2341a8e9d98ac17089fca1e8695"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.385010 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" event={"ID":"0086dcd3-2759-447b-907e-926a36e7a25d","Type":"ContainerStarted","Data":"7daed722d1f9c74deac6eb83030a491024bafd6db48d60dd5fb1eb2d176f9b44"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.394359 4856 generic.go:334] "Generic (PLEG): container finished" podID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerID="14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b" exitCode=0 Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.394453 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerDied","Data":"14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.394493 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerStarted","Data":"b62978e2a3aa477453a0bf551139c00b031ccfbd07b55ceb4eedc09f052de8df"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.402389 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.412032 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2ecb9fda-fb28-4f6e-a7a8-da20c7755351","Type":"ContainerStarted","Data":"3ca61aeb5ff92f9eefb20e69b9bb521b472e537869f51a4bb7c7aae90b8ac99a"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.412417 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2ecb9fda-fb28-4f6e-a7a8-da20c7755351","Type":"ContainerStarted","Data":"f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b"} Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.417595 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.436813 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.444128 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.444112247 podStartE2EDuration="2.444112247s" podCreationTimestamp="2026-03-20 13:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:45.44076997 +0000 UTC m=+220.321796100" watchObservedRunningTime="2026-03-20 13:26:45.444112247 +0000 UTC m=+220.325138377" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.523112 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.563683 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tgq7q"] Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.686432 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"] Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.838563 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1" path="/var/lib/kubelet/pods/86d2b40c-6b3f-4d1d-bd13-2424fcbe5fe1/volumes" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.839649 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.840174 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3fe7864-7af6-43c6-ac77-bac39a84b3d7" path="/var/lib/kubelet/pods/c3fe7864-7af6-43c6-ac77-bac39a84b3d7/volumes" Mar 20 13:26:45 crc kubenswrapper[4856]: 
I0320 13:26:45.842027 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"] Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.938422 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"] Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.939381 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.942181 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.942370 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.942894 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.946700 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.946862 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.947006 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.948679 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:26:45 crc kubenswrapper[4856]: I0320 13:26:45.961447 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"] Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.029660 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.029701 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.029763 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnpgm\" (UniqueName: \"kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.029782 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.029810 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.128536 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"] Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.130596 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnpgm\" (UniqueName: \"kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.131567 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.131608 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.131640 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config\") pod \"controller-manager-74dc68f858-crlhb\" (UID: 
\"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.131661 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.133502 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.134743 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.135814 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.140615 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.156347 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnpgm\" (UniqueName: \"kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm\") pod \"controller-manager-74dc68f858-crlhb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: W0320 13:26:46.161433 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a7156e_6f00_4f4a_98c8_9f592406eea3.slice/crio-8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce WatchSource:0}: Error finding container 8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce: Status 404 returned error can't find the container with id 8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.176012 4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:26:46 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld Mar 20 13:26:46 crc kubenswrapper[4856]: [+]process-running ok Mar 20 13:26:46 crc kubenswrapper[4856]: healthz check failed Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.176046 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.265691 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.475177 4856 generic.go:334] "Generic (PLEG): container finished" podID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerID="1446ec6efacaafe17866ebdba53fd523321a51c1eeaad0db53ee215eb620b072" exitCode=0 Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.475956 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerDied","Data":"1446ec6efacaafe17866ebdba53fd523321a51c1eeaad0db53ee215eb620b072"} Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.476013 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerStarted","Data":"8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce"} Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.491823 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.491912 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.511200 4856 patch_prober.go:28] interesting pod/console-f9d7485db-jwjhv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.511344 4856 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-f9d7485db-jwjhv" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.513659    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" event={"ID":"0086dcd3-2759-447b-907e-926a36e7a25d","Type":"ContainerStarted","Data":"dccfa539cfc84dcda704433fcafbdc6b4e249fadd5a315992ae0379d722aeedb"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.519155    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" event={"ID":"479327d7-e582-4367-9f68-2f65ce5c3dfe","Type":"ContainerStarted","Data":"9c4c5b4a510f6f36ac10dd0d63c0647257078705d3165a5cc74f0a1a8a5c7f9d"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.519304    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.519318    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" event={"ID":"479327d7-e582-4367-9f68-2f65ce5c3dfe","Type":"ContainerStarted","Data":"17979ceed6e61f396d185701ba7d40bf177005729a4250e7d3b863df2a47284e"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.527005    4856 generic.go:334] "Generic (PLEG): container finished" podID="2ecb9fda-fb28-4f6e-a7a8-da20c7755351" containerID="3ca61aeb5ff92f9eefb20e69b9bb521b472e537869f51a4bb7c7aae90b8ac99a" exitCode=0
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.527080    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2ecb9fda-fb28-4f6e-a7a8-da20c7755351","Type":"ContainerDied","Data":"3ca61aeb5ff92f9eefb20e69b9bb521b472e537869f51a4bb7c7aae90b8ac99a"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.548844    4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"]
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.549834    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" event={"ID":"a44a9d76-3f44-4824-89ed-5e9cbce7577e","Type":"ContainerStarted","Data":"b7f1f3c60f47c1a904245b1d3947c13ba4aa635d03bbf329e9a5ed460a99b527"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.549883    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" event={"ID":"a44a9d76-3f44-4824-89ed-5e9cbce7577e","Type":"ContainerStarted","Data":"f4a65d0a738cd2d7065faa44b8ef239e1a7df3c54109e77b8afe66edebfccabb"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.551105    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.558584    4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-7wsbq" podStartSLOduration=12.558487427 podStartE2EDuration="12.558487427s" podCreationTimestamp="2026-03-20 13:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:46.552567353 +0000 UTC m=+221.433593503" watchObservedRunningTime="2026-03-20 13:26:46.558487427 +0000 UTC m=+221.439513557"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.563073    4856 generic.go:334] "Generic (PLEG): container finished" podID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerID="e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf" exitCode=0
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.563147    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerDied","Data":"e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.563188    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerStarted","Data":"c0797837505fd4dfc26f2ca3af266ba77614a217b7b1dbd7ba703cf6d2d626f9"}
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.566171    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.589880    4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" podStartSLOduration=4.589860753 podStartE2EDuration="4.589860753s" podCreationTimestamp="2026-03-20 13:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:46.583008234 +0000 UTC m=+221.464034374" watchObservedRunningTime="2026-03-20 13:26:46.589860753 +0000 UTC m=+221.470886873"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.590403    4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.590467    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.603616    4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.603823    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.638233    4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.641906    4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:46 crc kubenswrapper[4856]: I0320 13:26:46.657097    4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" podStartSLOduration=172.657075403 podStartE2EDuration="2m52.657075403s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:46.631028255 +0000 UTC m=+221.512054385" watchObservedRunningTime="2026-03-20 13:26:46.657075403 +0000 UTC m=+221.538101533"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.100336    4856 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pbqq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.100594    4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-9pbqq" podUID="ae4a83e0-1d82-4893-af26-ff6f5741b9a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.100434    4856 patch_prober.go:28] interesting pod/downloads-7954f5f757-9pbqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.100683    4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-9pbqq" podUID="ae4a83e0-1d82-4893-af26-ff6f5741b9a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.25:8080/\": dial tcp 10.217.0.25:8080: connect: connection refused"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.169878    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pmhrr"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.174486    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:47 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:47 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:47 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.174635    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.232258    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.577611    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" event={"ID":"6570e719-fd85-4925-8fa6-b8e1d2dc50eb","Type":"ContainerStarted","Data":"ac246fe472d4a1af0990c6382ad7d8b146a58d3584a18c343d04a14731353155"}
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.577650    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" event={"ID":"6570e719-fd85-4925-8fa6-b8e1d2dc50eb","Type":"ContainerStarted","Data":"51df95b837a7df57fdfb464e6e64270a8234223df4d71a0fcca9bdca9f9fcd6d"}
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.581058    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.585538    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fqq88"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.586017    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hqths"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.587917    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb"
Mar 20 13:26:47 crc kubenswrapper[4856]: I0320 13:26:47.598323    4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" podStartSLOduration=5.598307866 podStartE2EDuration="5.598307866s" podCreationTimestamp="2026-03-20 13:26:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:47.595299416 +0000 UTC m=+222.476325556" watchObservedRunningTime="2026-03-20 13:26:47.598307866 +0000 UTC m=+222.479334006"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.087166    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.175766    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:48 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:48 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:48 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.175834    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.201786    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access\") pod \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") "
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.201915    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir\") pod \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\" (UID: \"2ecb9fda-fb28-4f6e-a7a8-da20c7755351\") "
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.202176    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2ecb9fda-fb28-4f6e-a7a8-da20c7755351" (UID: "2ecb9fda-fb28-4f6e-a7a8-da20c7755351"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.227376    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2ecb9fda-fb28-4f6e-a7a8-da20c7755351" (UID: "2ecb9fda-fb28-4f6e-a7a8-da20c7755351"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.303591    4856 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.303632    4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ecb9fda-fb28-4f6e-a7a8-da20c7755351-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.605242    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"2ecb9fda-fb28-4f6e-a7a8-da20c7755351","Type":"ContainerDied","Data":"f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b"}
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.605291    4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f28dd003e890143414bb229cfd8cd51a9e2755f6836772bf4dc03e3b0818043b"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.605358    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.615102    4856 generic.go:334] "Generic (PLEG): container finished" podID="1f0acb35-a304-43e3-8306-5c5319d0e8e8" containerID="2f1a52b951c5480ea3fd08fab44d8c11ef8195ee8461d4246c8df67127bf2522" exitCode=0
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.615127    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" event={"ID":"1f0acb35-a304-43e3-8306-5c5319d0e8e8","Type":"ContainerDied","Data":"2f1a52b951c5480ea3fd08fab44d8c11ef8195ee8461d4246c8df67127bf2522"}
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.867045    4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 13:26:48 crc kubenswrapper[4856]: E0320 13:26:48.867242    4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ecb9fda-fb28-4f6e-a7a8-da20c7755351" containerName="pruner"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.867252    4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ecb9fda-fb28-4f6e-a7a8-da20c7755351" containerName="pruner"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.867368    4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ecb9fda-fb28-4f6e-a7a8-da20c7755351" containerName="pruner"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.867755    4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.883829    4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.884530    4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4856]: I0320 13:26:48.896860    4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.014615    4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.014670    4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.115743    4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.115815    4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.115940    4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.136811    4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.172104    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:49 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:49 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:49 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.172155    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.207771    4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.349798    4856 ???:1] "http: TLS handshake error from 192.168.126.11:48692: no serving certificate available for the kubelet"
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.509834    4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.631894    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37730f1d-a130-4c64-bae5-acbb66f16eff","Type":"ContainerStarted","Data":"34d72924d85b448e9685fc700db320fad8de36a47279b89cb2a77599d94bfb67"}
Mar 20 13:26:49 crc kubenswrapper[4856]: I0320 13:26:49.955711    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.029260    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume\") pod \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") "
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.029334    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xglz7\" (UniqueName: \"kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7\") pod \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") "
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.029397    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume\") pod \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\" (UID: \"1f0acb35-a304-43e3-8306-5c5319d0e8e8\") "
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.030749    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume" (OuterVolumeSpecName: "config-volume") pod "1f0acb35-a304-43e3-8306-5c5319d0e8e8" (UID: "1f0acb35-a304-43e3-8306-5c5319d0e8e8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.035357    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7" (OuterVolumeSpecName: "kube-api-access-xglz7") pod "1f0acb35-a304-43e3-8306-5c5319d0e8e8" (UID: "1f0acb35-a304-43e3-8306-5c5319d0e8e8"). InnerVolumeSpecName "kube-api-access-xglz7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.035887    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1f0acb35-a304-43e3-8306-5c5319d0e8e8" (UID: "1f0acb35-a304-43e3-8306-5c5319d0e8e8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.131181    4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1f0acb35-a304-43e3-8306-5c5319d0e8e8-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.131211    4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xglz7\" (UniqueName: \"kubernetes.io/projected/1f0acb35-a304-43e3-8306-5c5319d0e8e8-kube-api-access-xglz7\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.131224    4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1f0acb35-a304-43e3-8306-5c5319d0e8e8-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.179496    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:50 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:50 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:50 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.179839    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.392860    4856 ???:1] "http: TLS handshake error from 192.168.126.11:48700: no serving certificate available for the kubelet"
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.668575    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh" event={"ID":"1f0acb35-a304-43e3-8306-5c5319d0e8e8","Type":"ContainerDied","Data":"4dda566d1e6539831985cb538e5f574f195af6d8ef80e1a2eba827a80b900bb4"}
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.668611    4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dda566d1e6539831985cb538e5f574f195af6d8ef80e1a2eba827a80b900bb4"
Mar 20 13:26:50 crc kubenswrapper[4856]: I0320 13:26:50.668711    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"
Mar 20 13:26:51 crc kubenswrapper[4856]: I0320 13:26:51.174009    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:51 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:51 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:51 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:51 crc kubenswrapper[4856]: I0320 13:26:51.174120    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:51 crc kubenswrapper[4856]: I0320 13:26:51.690152    4856 generic.go:334] "Generic (PLEG): container finished" podID="37730f1d-a130-4c64-bae5-acbb66f16eff" containerID="63a763f21ba944b478049ea6cd24e5b8a0f466c52cd2c65b40f9022c01834d66" exitCode=0
Mar 20 13:26:51 crc kubenswrapper[4856]: I0320 13:26:51.690199    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37730f1d-a130-4c64-bae5-acbb66f16eff","Type":"ContainerDied","Data":"63a763f21ba944b478049ea6cd24e5b8a0f466c52cd2c65b40f9022c01834d66"}
Mar 20 13:26:52 crc kubenswrapper[4856]: I0320 13:26:52.173316    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:52 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:52 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:52 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:52 crc kubenswrapper[4856]: I0320 13:26:52.173386    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:52 crc kubenswrapper[4856]: I0320 13:26:52.597972    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vkb56"
Mar 20 13:26:53 crc kubenswrapper[4856]: I0320 13:26:53.172471    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:53 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:53 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:53 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:53 crc kubenswrapper[4856]: I0320 13:26:53.172724    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:54 crc kubenswrapper[4856]: I0320 13:26:54.174109    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:54 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:54 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:54 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:54 crc kubenswrapper[4856]: I0320 13:26:54.174479    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:55 crc kubenswrapper[4856]: I0320 13:26:55.172894    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:55 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:55 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:55 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:55 crc kubenswrapper[4856]: I0320 13:26:55.172949    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:56 crc kubenswrapper[4856]: I0320 13:26:56.172576    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:56 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:56 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:56 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:56 crc kubenswrapper[4856]: I0320 13:26:56.172865    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:56 crc kubenswrapper[4856]: I0320 13:26:56.500131    4856 patch_prober.go:28] interesting pod/console-f9d7485db-jwjhv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Mar 20 13:26:56 crc kubenswrapper[4856]: I0320 13:26:56.500196    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jwjhv" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused"
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.257552    4856 patch_prober.go:28] interesting pod/router-default-5444994796-pmhrr container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:26:58 crc kubenswrapper[4856]: [-]has-synced failed: reason withheld
Mar 20 13:26:58 crc kubenswrapper[4856]: [+]process-running ok
Mar 20 13:26:58 crc kubenswrapper[4856]: healthz check failed
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.257970    4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pmhrr" podUID="f8c071e4-4f55-46c4-944f-05ba67dec8dd" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.293108    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-9pbqq"
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.622371    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.744762    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access\") pod \"37730f1d-a130-4c64-bae5-acbb66f16eff\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") "
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.744829    4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir\") pod \"37730f1d-a130-4c64-bae5-acbb66f16eff\" (UID: \"37730f1d-a130-4c64-bae5-acbb66f16eff\") "
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.744981    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "37730f1d-a130-4c64-bae5-acbb66f16eff" (UID: "37730f1d-a130-4c64-bae5-acbb66f16eff"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.751035    4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "37730f1d-a130-4c64-bae5-acbb66f16eff" (UID: "37730f1d-a130-4c64-bae5-acbb66f16eff"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.846345    4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37730f1d-a130-4c64-bae5-acbb66f16eff-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:58 crc kubenswrapper[4856]: I0320 13:26:58.846375    4856 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/37730f1d-a130-4c64-bae5-acbb66f16eff-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:59 crc kubenswrapper[4856]: I0320 13:26:59.173307    4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pmhrr"
Mar 20 13:26:59 crc kubenswrapper[4856]: I0320 13:26:59.176546    4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pmhrr"
Mar 20 13:26:59 crc kubenswrapper[4856]: I0320 13:26:59.293094    4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:26:59 crc kubenswrapper[4856]: I0320 13:26:59.293431    4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"37730f1d-a130-4c64-bae5-acbb66f16eff","Type":"ContainerDied","Data":"34d72924d85b448e9685fc700db320fad8de36a47279b89cb2a77599d94bfb67"}
Mar 20 13:26:59 crc kubenswrapper[4856]: I0320 13:26:59.293463    4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34d72924d85b448e9685fc700db320fad8de36a47279b89cb2a77599d94bfb67"
Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.494609    4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp"
Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.515511    4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca-metrics-certs\") pod \"network-metrics-daemon-qtlvp\" (UID: \"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca\") " pod="openshift-multus/network-metrics-daemon-qtlvp"
Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.662109    4856 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qtlvp" Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.976852 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"] Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.977128 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerName="controller-manager" containerID="cri-o://ac246fe472d4a1af0990c6382ad7d8b146a58d3584a18c343d04a14731353155" gracePeriod=30 Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.982333 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"] Mar 20 13:27:01 crc kubenswrapper[4856]: I0320 13:27:01.982562 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerName="route-controller-manager" containerID="cri-o://b7f1f3c60f47c1a904245b1d3947c13ba4aa635d03bbf329e9a5ed460a99b527" gracePeriod=30 Mar 20 13:27:05 crc kubenswrapper[4856]: I0320 13:27:05.268718 4856 patch_prober.go:28] interesting pod/route-controller-manager-566c496b48-t6ds6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 20 13:27:05 crc kubenswrapper[4856]: I0320 13:27:05.269059 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 20 13:27:05 crc kubenswrapper[4856]: I0320 13:27:05.533996 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:27:06 crc kubenswrapper[4856]: I0320 13:27:06.627794 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:27:06 crc kubenswrapper[4856]: I0320 13:27:06.640126 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:27:07 crc kubenswrapper[4856]: I0320 13:27:07.266781 4856 patch_prober.go:28] interesting pod/controller-manager-74dc68f858-crlhb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:27:07 crc kubenswrapper[4856]: I0320 13:27:07.267262 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.347380 4856 generic.go:334] "Generic (PLEG): container finished" podID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerID="ac246fe472d4a1af0990c6382ad7d8b146a58d3584a18c343d04a14731353155" exitCode=0 Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.347519 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" 
event={"ID":"6570e719-fd85-4925-8fa6-b8e1d2dc50eb","Type":"ContainerDied","Data":"ac246fe472d4a1af0990c6382ad7d8b146a58d3584a18c343d04a14731353155"} Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.350217 4856 generic.go:334] "Generic (PLEG): container finished" podID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerID="b7f1f3c60f47c1a904245b1d3947c13ba4aa635d03bbf329e9a5ed460a99b527" exitCode=0 Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.350280 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" event={"ID":"a44a9d76-3f44-4824-89ed-5e9cbce7577e","Type":"ContainerDied","Data":"b7f1f3c60f47c1a904245b1d3947c13ba4aa635d03bbf329e9a5ed460a99b527"} Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.988836 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:27:09 crc kubenswrapper[4856]: I0320 13:27:09.988940 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:27:10 crc kubenswrapper[4856]: I0320 13:27:10.895517 4856 ???:1] "http: TLS handshake error from 192.168.126.11:34790: no serving certificate available for the kubelet" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.612874 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.619073 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.649611 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"] Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.650075 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerName="route-controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.650094 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerName="route-controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.650135 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37730f1d-a130-4c64-bae5-acbb66f16eff" containerName="pruner" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.650143 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="37730f1d-a130-4c64-bae5-acbb66f16eff" containerName="pruner" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.650157 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0acb35-a304-43e3-8306-5c5319d0e8e8" containerName="collect-profiles" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.650165 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0acb35-a304-43e3-8306-5c5319d0e8e8" containerName="collect-profiles" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.650175 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerName="controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.650182 4856 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerName="controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.650406 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="37730f1d-a130-4c64-bae5-acbb66f16eff" containerName="pruner" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.651350 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" containerName="route-controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.651363 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" containerName="controller-manager" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.651375 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0acb35-a304-43e3-8306-5c5319d0e8e8" containerName="collect-profiles" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.650864 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.651985 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.653135 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5w6wj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-7jsvf_openshift-marketplace(3e32090f-521c-4585-884b-650644c11aee): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context 
canceled" logger="UnhandledError" Mar 20 13:27:14 crc kubenswrapper[4856]: E0320 13:27:14.657470 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-7jsvf" podUID="3e32090f-521c-4585-884b-650644c11aee" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.670409 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.670458 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.670505 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lbpv\" (UniqueName: \"kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.670597 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.671305 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"] Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771653 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert\") pod \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771725 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles\") pod \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771767 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnpgm\" (UniqueName: \"kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm\") pod \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771793 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlxzb\" (UniqueName: \"kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb\") pod \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771823 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config\") pod \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771846 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert\") pod \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771868 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca\") pod \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\" (UID: \"a44a9d76-3f44-4824-89ed-5e9cbce7577e\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771895 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca\") pod \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.771918 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config\") pod \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\" (UID: \"6570e719-fd85-4925-8fa6-b8e1d2dc50eb\") " Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.772047 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lbpv\" (UniqueName: \"kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: 
\"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.772112 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.772162 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.772185 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.773471 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "6570e719-fd85-4925-8fa6-b8e1d2dc50eb" (UID: "6570e719-fd85-4925-8fa6-b8e1d2dc50eb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.773931 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca" (OuterVolumeSpecName: "client-ca") pod "a44a9d76-3f44-4824-89ed-5e9cbce7577e" (UID: "a44a9d76-3f44-4824-89ed-5e9cbce7577e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.774074 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6570e719-fd85-4925-8fa6-b8e1d2dc50eb" (UID: "6570e719-fd85-4925-8fa6-b8e1d2dc50eb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.774097 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config" (OuterVolumeSpecName: "config") pod "a44a9d76-3f44-4824-89ed-5e9cbce7577e" (UID: "a44a9d76-3f44-4824-89ed-5e9cbce7577e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.774984 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.775222 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config" (OuterVolumeSpecName: "config") pod "6570e719-fd85-4925-8fa6-b8e1d2dc50eb" (UID: "6570e719-fd85-4925-8fa6-b8e1d2dc50eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.776416 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb" (OuterVolumeSpecName: "kube-api-access-vlxzb") pod "a44a9d76-3f44-4824-89ed-5e9cbce7577e" (UID: "a44a9d76-3f44-4824-89ed-5e9cbce7577e"). InnerVolumeSpecName "kube-api-access-vlxzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.776514 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm" (OuterVolumeSpecName: "kube-api-access-jnpgm") pod "6570e719-fd85-4925-8fa6-b8e1d2dc50eb" (UID: "6570e719-fd85-4925-8fa6-b8e1d2dc50eb"). InnerVolumeSpecName "kube-api-access-jnpgm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.776995 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.777005 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6570e719-fd85-4925-8fa6-b8e1d2dc50eb" (UID: "6570e719-fd85-4925-8fa6-b8e1d2dc50eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.777249 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a44a9d76-3f44-4824-89ed-5e9cbce7577e" (UID: "a44a9d76-3f44-4824-89ed-5e9cbce7577e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.785585 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.789184 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lbpv\" (UniqueName: \"kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv\") pod \"route-controller-manager-7b5cf75d59-hrb79\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") " pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873859 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873887 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873910 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873922 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnpgm\" (UniqueName: \"kubernetes.io/projected/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-kube-api-access-jnpgm\") on node \"crc\" DevicePath \"\"" Mar 20 
13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873939 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlxzb\" (UniqueName: \"kubernetes.io/projected/a44a9d76-3f44-4824-89ed-5e9cbce7577e-kube-api-access-vlxzb\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873947 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873955 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a44a9d76-3f44-4824-89ed-5e9cbce7577e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873963 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a44a9d76-3f44-4824-89ed-5e9cbce7577e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.873988 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6570e719-fd85-4925-8fa6-b8e1d2dc50eb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:14 crc kubenswrapper[4856]: I0320 13:27:14.977517 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.384068 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" event={"ID":"6570e719-fd85-4925-8fa6-b8e1d2dc50eb","Type":"ContainerDied","Data":"51df95b837a7df57fdfb464e6e64270a8234223df4d71a0fcca9bdca9f9fcd6d"} Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.384382 4856 scope.go:117] "RemoveContainer" containerID="ac246fe472d4a1af0990c6382ad7d8b146a58d3584a18c343d04a14731353155" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.384112 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74dc68f858-crlhb" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.386103 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.386168 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6" event={"ID":"a44a9d76-3f44-4824-89ed-5e9cbce7577e","Type":"ContainerDied","Data":"f4a65d0a738cd2d7065faa44b8ef239e1a7df3c54109e77b8afe66edebfccabb"} Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.444296 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"] Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.453106 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-566c496b48-t6ds6"] Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.455747 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"] Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.458225 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74dc68f858-crlhb"] Mar 20 13:27:15 crc kubenswrapper[4856]: E0320 13:27:15.552730 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-7jsvf" podUID="3e32090f-521c-4585-884b-650644c11aee" Mar 20 13:27:15 crc kubenswrapper[4856]: E0320 13:27:15.589450 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:27:15 crc kubenswrapper[4856]: E0320 13:27:15.589593 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:27:15 crc kubenswrapper[4856]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:27:15 crc kubenswrapper[4856]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-db9nr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566886-xkbwc_openshift-infra(46f98403-30f8-40f6-afa6-6defe5937024): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:27:15 crc kubenswrapper[4856]: > logger="UnhandledError" Mar 20 13:27:15 crc kubenswrapper[4856]: E0320 13:27:15.591249 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" podUID="46f98403-30f8-40f6-afa6-6defe5937024" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.826906 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6570e719-fd85-4925-8fa6-b8e1d2dc50eb" path="/var/lib/kubelet/pods/6570e719-fd85-4925-8fa6-b8e1d2dc50eb/volumes" Mar 20 13:27:15 crc kubenswrapper[4856]: I0320 13:27:15.827630 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a44a9d76-3f44-4824-89ed-5e9cbce7577e" path="/var/lib/kubelet/pods/a44a9d76-3f44-4824-89ed-5e9cbce7577e/volumes" Mar 20 13:27:16 crc kubenswrapper[4856]: E0320 13:27:16.393659 4856 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" podUID="46f98403-30f8-40f6-afa6-6defe5937024" Mar 20 13:27:17 crc kubenswrapper[4856]: E0320 13:27:17.025265 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:27:17 crc kubenswrapper[4856]: E0320 13:27:17.025568 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fdvkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,App
ArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m6lhf_openshift-marketplace(1433d94f-1c65-49ca-a1ed-cb24d864eb55): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:27:17 crc kubenswrapper[4856]: E0320 13:27:17.026742 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m6lhf" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.206259 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bf2lq" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.247102 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"] Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.248162 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.250362 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.250782 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.250901 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.251161 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.251285 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.253203 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.256989 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"] Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.258316 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.329838 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " 
pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.329903 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.330078 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.330157 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.330316 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ks6k\" (UniqueName: \"kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.362617 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.431124 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.431180 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.431212 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ks6k\" (UniqueName: \"kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.431249 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.431322 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles\") pod 
\"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.432494 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.434290 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.434630 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.445608 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.456173 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ks6k\" (UniqueName: 
\"kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k\") pod \"controller-manager-7c8565cd6-cm4dl\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") " pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:17 crc kubenswrapper[4856]: I0320 13:27:17.571777 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.867840 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.868816 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.871041 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.871329 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.871759 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.959541 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:19 crc kubenswrapper[4856]: I0320 13:27:19.959607 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.060718 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.060823 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.060890 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.079053 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.184866 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.454540 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m6lhf" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.565547 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.565680 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lk5fw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-w59xx_openshift-marketplace(b99e422b-ccde-422a-869f-7898a008a66a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.566911 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-w59xx" podUID="b99e422b-ccde-422a-869f-7898a008a66a" Mar 20 13:27:20 crc 
kubenswrapper[4856]: I0320 13:27:20.568598 4856 scope.go:117] "RemoveContainer" containerID="b7f1f3c60f47c1a904245b1d3947c13ba4aa635d03bbf329e9a5ed460a99b527" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.584341 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.584468 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkjtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLo
gsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-62lq9_openshift-marketplace(54a7156e-6f00-4f4a-98c8-9f592406eea3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.586830 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-62lq9" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.604575 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.604809 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jmsmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-tgq7q_openshift-marketplace(9e31609a-8b57-4cae-a4a7-cfe4a24e346b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:27:20 crc kubenswrapper[4856]: E0320 13:27:20.605987 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-tgq7q" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" Mar 20 13:27:20 crc 
kubenswrapper[4856]: I0320 13:27:20.878460 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"] Mar 20 13:27:20 crc kubenswrapper[4856]: W0320 13:27:20.884412 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ed0de4_eed8_4b46_857d_61938ae2d9d4.slice/crio-94daa85dfba3f9fc3fd973ff3d1b738a7d757cdcd8c9a6a6c9f63d368bb46ec2 WatchSource:0}: Error finding container 94daa85dfba3f9fc3fd973ff3d1b738a7d757cdcd8c9a6a6c9f63d368bb46ec2: Status 404 returned error can't find the container with id 94daa85dfba3f9fc3fd973ff3d1b738a7d757cdcd8c9a6a6c9f63d368bb46ec2 Mar 20 13:27:20 crc kubenswrapper[4856]: I0320 13:27:20.968414 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qtlvp"] Mar 20 13:27:20 crc kubenswrapper[4856]: W0320 13:27:20.993979 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ab45c47_75d7_4d3f_b56c_45cc5ccbbfca.slice/crio-3e3df7420cb5e9c07b0e58e14055fff4be8bffbd152c927790fe9ce262144532 WatchSource:0}: Error finding container 3e3df7420cb5e9c07b0e58e14055fff4be8bffbd152c927790fe9ce262144532: Status 404 returned error can't find the container with id 3e3df7420cb5e9c07b0e58e14055fff4be8bffbd152c927790fe9ce262144532 Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.055061 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.058245 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"] Mar 20 13:27:21 crc kubenswrapper[4856]: W0320 13:27:21.089989 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d7126d8_9ade_4ab3_8563_2e18034bab6d.slice/crio-6ad46335c07469daee74e46ab25bd50f42453fc43740af014c14959361c98e21 WatchSource:0}: Error finding container 6ad46335c07469daee74e46ab25bd50f42453fc43740af014c14959361c98e21: Status 404 returned error can't find the container with id 6ad46335c07469daee74e46ab25bd50f42453fc43740af014c14959361c98e21 Mar 20 13:27:21 crc kubenswrapper[4856]: W0320 13:27:21.090805 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod02e21f2d_7c3e_489e_a49c_c800d328a1cd.slice/crio-490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075 WatchSource:0}: Error finding container 490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075: Status 404 returned error can't find the container with id 490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075 Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.461449 4856 generic.go:334] "Generic (PLEG): container finished" podID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerID="643eb7804d77f9b94f9e8a6b223c2071ee1af944d4bbbf31e9ee41ddafba9a15" exitCode=0 Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.461621 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerDied","Data":"643eb7804d77f9b94f9e8a6b223c2071ee1af944d4bbbf31e9ee41ddafba9a15"} Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.477750 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" event={"ID":"9d7126d8-9ade-4ab3-8563-2e18034bab6d","Type":"ContainerStarted","Data":"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"} Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.477793 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" event={"ID":"9d7126d8-9ade-4ab3-8563-2e18034bab6d","Type":"ContainerStarted","Data":"6ad46335c07469daee74e46ab25bd50f42453fc43740af014c14959361c98e21"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.478228 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.483810 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" event={"ID":"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca","Type":"ContainerStarted","Data":"161f671e0fd02ed6e95c8a93a766080bf39f3400df404ac9ca947dea8e1f2022"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.483849 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" event={"ID":"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca","Type":"ContainerStarted","Data":"3e3df7420cb5e9c07b0e58e14055fff4be8bffbd152c927790fe9ce262144532"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.496723 4856 generic.go:334] "Generic (PLEG): container finished" podID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerID="4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078" exitCode=0
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.496994 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerDied","Data":"4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.504051 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.515048 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" podStartSLOduration=20.515032275 podStartE2EDuration="20.515032275s" podCreationTimestamp="2026-03-20 13:27:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:21.513609918 +0000 UTC m=+256.394636048" watchObservedRunningTime="2026-03-20 13:27:21.515032275 +0000 UTC m=+256.396058405"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.523424 4856 generic.go:334] "Generic (PLEG): container finished" podID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerID="998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd" exitCode=0
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.523489 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerDied","Data":"998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.534485 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" event={"ID":"e9ed0de4-eed8-4b46-857d-61938ae2d9d4","Type":"ContainerStarted","Data":"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.534530 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" event={"ID":"e9ed0de4-eed8-4b46-857d-61938ae2d9d4","Type":"ContainerStarted","Data":"94daa85dfba3f9fc3fd973ff3d1b738a7d757cdcd8c9a6a6c9f63d368bb46ec2"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.535695 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.546248 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"02e21f2d-7c3e-489e-a49c-c800d328a1cd","Type":"ContainerStarted","Data":"fafc2943304aaf68877c588931f9be4f38d331b925b5d21800937d8c465f490d"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.546306 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"02e21f2d-7c3e-489e-a49c-c800d328a1cd","Type":"ContainerStarted","Data":"490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075"}
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.547062 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"
Mar 20 13:27:21 crc kubenswrapper[4856]: E0320 13:27:21.553200 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-w59xx" podUID="b99e422b-ccde-422a-869f-7898a008a66a"
Mar 20 13:27:21 crc kubenswrapper[4856]: E0320 13:27:21.553415 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-tgq7q" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b"
Mar 20 13:27:21 crc kubenswrapper[4856]: E0320 13:27:21.554159 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-62lq9" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.597396 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" podStartSLOduration=19.597381729 podStartE2EDuration="19.597381729s" podCreationTimestamp="2026-03-20 13:27:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:21.596557098 +0000 UTC m=+256.477583238" watchObservedRunningTime="2026-03-20 13:27:21.597381729 +0000 UTC m=+256.478407859"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.664186 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.664170438 podStartE2EDuration="2.664170438s" podCreationTimestamp="2026-03-20 13:27:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:21.644608999 +0000 UTC m=+256.525635159" watchObservedRunningTime="2026-03-20 13:27:21.664170438 +0000 UTC m=+256.545196568"
Mar 20 13:27:21 crc kubenswrapper[4856]: I0320 13:27:21.949083 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"]
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.038737 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"]
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.560569 4856 generic.go:334] "Generic (PLEG): container finished" podID="02e21f2d-7c3e-489e-a49c-c800d328a1cd" containerID="fafc2943304aaf68877c588931f9be4f38d331b925b5d21800937d8c465f490d" exitCode=0
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.560616 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"02e21f2d-7c3e-489e-a49c-c800d328a1cd","Type":"ContainerDied","Data":"fafc2943304aaf68877c588931f9be4f38d331b925b5d21800937d8c465f490d"}
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.565397 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerStarted","Data":"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0"}
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.571764 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qtlvp" event={"ID":"2ab45c47-75d7-4d3f-b56c-45cc5ccbbfca","Type":"ContainerStarted","Data":"370d0b92d24d1d127106b0456c8c2ec49dc75265df98657dec7a1d50617651b0"}
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.584536 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerStarted","Data":"fb1f4cb66483458c7c1a3403def290d898c387ea7b63f51781d9e7be835368d1"}
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.587841 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerStarted","Data":"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74"}
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.607255 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qtlvp" podStartSLOduration=208.607233518 podStartE2EDuration="3m28.607233518s" podCreationTimestamp="2026-03-20 13:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:22.602459694 +0000 UTC m=+257.483485824" watchObservedRunningTime="2026-03-20 13:27:22.607233518 +0000 UTC m=+257.488259648"
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.621786 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5w74v" podStartSLOduration=2.8721626799999997 podStartE2EDuration="39.621765307s" podCreationTimestamp="2026-03-20 13:26:43 +0000 UTC" firstStartedPulling="2026-03-20 13:26:45.395909092 +0000 UTC m=+220.276935222" lastFinishedPulling="2026-03-20 13:27:22.145511709 +0000 UTC m=+257.026537849" observedRunningTime="2026-03-20 13:27:22.619632241 +0000 UTC m=+257.500658371" watchObservedRunningTime="2026-03-20 13:27:22.621765307 +0000 UTC m=+257.502791437"
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.643611 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ngck4" podStartSLOduration=3.865660577 podStartE2EDuration="41.643592334s" podCreationTimestamp="2026-03-20 13:26:41 +0000 UTC" firstStartedPulling="2026-03-20 13:26:44.296177543 +0000 UTC m=+219.177203673" lastFinishedPulling="2026-03-20 13:27:22.0741093 +0000 UTC m=+256.955135430" observedRunningTime="2026-03-20 13:27:22.641854129 +0000 UTC m=+257.522880259" watchObservedRunningTime="2026-03-20 13:27:22.643592334 +0000 UTC m=+257.524618464"
Mar 20 13:27:22 crc kubenswrapper[4856]: I0320 13:27:22.659824 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dqwl2" podStartSLOduration=3.833375769 podStartE2EDuration="41.659809507s" podCreationTimestamp="2026-03-20 13:26:41 +0000 UTC" firstStartedPulling="2026-03-20 13:26:44.300620909 +0000 UTC m=+219.181647039" lastFinishedPulling="2026-03-20 13:27:22.127054647 +0000 UTC m=+257.008080777" observedRunningTime="2026-03-20 13:27:22.657668341 +0000 UTC m=+257.538694491" watchObservedRunningTime="2026-03-20 13:27:22.659809507 +0000 UTC m=+257.540835637"
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.595866 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" podUID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" containerName="route-controller-manager" containerID="cri-o://d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.596398 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" podUID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" containerName="controller-manager" containerID="cri-o://4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51" gracePeriod=30
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.833795 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5w74v"
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.833959 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5w74v"
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.866965 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.913060 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access\") pod \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") "
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.913136 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir\") pod \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\" (UID: \"02e21f2d-7c3e-489e-a49c-c800d328a1cd\") "
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.913355 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "02e21f2d-7c3e-489e-a49c-c800d328a1cd" (UID: "02e21f2d-7c3e-489e-a49c-c800d328a1cd"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.918412 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "02e21f2d-7c3e-489e-a49c-c800d328a1cd" (UID: "02e21f2d-7c3e-489e-a49c-c800d328a1cd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:23 crc kubenswrapper[4856]: I0320 13:27:23.999197 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.013782 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lbpv\" (UniqueName: \"kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv\") pod \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.013839 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config\") pod \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.013879 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert\") pod \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.013902 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca\") pod \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\" (UID: \"e9ed0de4-eed8-4b46-857d-61938ae2d9d4\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.014039 4856 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.014050 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e21f2d-7c3e-489e-a49c-c800d328a1cd-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.014951 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config" (OuterVolumeSpecName: "config") pod "e9ed0de4-eed8-4b46-857d-61938ae2d9d4" (UID: "e9ed0de4-eed8-4b46-857d-61938ae2d9d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.016440 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "e9ed0de4-eed8-4b46-857d-61938ae2d9d4" (UID: "e9ed0de4-eed8-4b46-857d-61938ae2d9d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.017295 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e9ed0de4-eed8-4b46-857d-61938ae2d9d4" (UID: "e9ed0de4-eed8-4b46-857d-61938ae2d9d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.017396 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv" (OuterVolumeSpecName: "kube-api-access-2lbpv") pod "e9ed0de4-eed8-4b46-857d-61938ae2d9d4" (UID: "e9ed0de4-eed8-4b46-857d-61938ae2d9d4"). InnerVolumeSpecName "kube-api-access-2lbpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.072051 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119569 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ks6k\" (UniqueName: \"kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k\") pod \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119624 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert\") pod \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119666 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config\") pod \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119686 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles\") pod \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119704 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca\") pod \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\" (UID: \"9d7126d8-9ade-4ab3-8563-2e18034bab6d\") "
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119856 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119867 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119877 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lbpv\" (UniqueName: \"kubernetes.io/projected/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-kube-api-access-2lbpv\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.119885 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9ed0de4-eed8-4b46-857d-61938ae2d9d4-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.120255 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d7126d8-9ade-4ab3-8563-2e18034bab6d" (UID: "9d7126d8-9ade-4ab3-8563-2e18034bab6d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.120287 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9d7126d8-9ade-4ab3-8563-2e18034bab6d" (UID: "9d7126d8-9ade-4ab3-8563-2e18034bab6d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.120492 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config" (OuterVolumeSpecName: "config") pod "9d7126d8-9ade-4ab3-8563-2e18034bab6d" (UID: "9d7126d8-9ade-4ab3-8563-2e18034bab6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.121497 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k" (OuterVolumeSpecName: "kube-api-access-6ks6k") pod "9d7126d8-9ade-4ab3-8563-2e18034bab6d" (UID: "9d7126d8-9ade-4ab3-8563-2e18034bab6d"). InnerVolumeSpecName "kube-api-access-6ks6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.122143 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d7126d8-9ade-4ab3-8563-2e18034bab6d" (UID: "9d7126d8-9ade-4ab3-8563-2e18034bab6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.220506 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ks6k\" (UniqueName: \"kubernetes.io/projected/9d7126d8-9ade-4ab3-8563-2e18034bab6d-kube-api-access-6ks6k\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.220540 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d7126d8-9ade-4ab3-8563-2e18034bab6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.220550 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.220559 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.220568 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d7126d8-9ade-4ab3-8563-2e18034bab6d-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.603403 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"02e21f2d-7c3e-489e-a49c-c800d328a1cd","Type":"ContainerDied","Data":"490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075"}
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.603447 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="490545f31d2c3a33bc7fffe1b1f24ee625e3fb045277f7b7731e5013f671c075"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.604489 4856 generic.go:334] "Generic (PLEG): container finished" podID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" containerID="d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.604528 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.603425 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.604542 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" event={"ID":"e9ed0de4-eed8-4b46-857d-61938ae2d9d4","Type":"ContainerDied","Data":"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"}
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.604620 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79" event={"ID":"e9ed0de4-eed8-4b46-857d-61938ae2d9d4","Type":"ContainerDied","Data":"94daa85dfba3f9fc3fd973ff3d1b738a7d757cdcd8c9a6a6c9f63d368bb46ec2"}
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.604646 4856 scope.go:117] "RemoveContainer" containerID="d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.605887 4856 generic.go:334] "Generic (PLEG): container finished" podID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" containerID="4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51" exitCode=0
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.605938 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.605961 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" event={"ID":"9d7126d8-9ade-4ab3-8563-2e18034bab6d","Type":"ContainerDied","Data":"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"}
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.605976 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8565cd6-cm4dl" event={"ID":"9d7126d8-9ade-4ab3-8563-2e18034bab6d","Type":"ContainerDied","Data":"6ad46335c07469daee74e46ab25bd50f42453fc43740af014c14959361c98e21"}
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.620814 4856 scope.go:117] "RemoveContainer" containerID="d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"
Mar 20 13:27:24 crc kubenswrapper[4856]: E0320 13:27:24.621339 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c\": container with ID starting with d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c not found: ID does not exist" containerID="d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.621393 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c"} err="failed to get container status \"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c\": rpc error: code = NotFound desc = could not find container \"d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c\": container with ID starting with d0efe843f44901b1a5e9f230235325b4c814856720f56f62fa85dcc8a950299c not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.621425 4856 scope.go:117] "RemoveContainer" containerID="4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.640736 4856 scope.go:117] "RemoveContainer" containerID="4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"
Mar 20 13:27:24 crc kubenswrapper[4856]: E0320 13:27:24.642305 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51\": container with ID starting with 4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51 not found: ID does not exist" containerID="4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.642339 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51"} err="failed to get container status \"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51\": rpc error: code = NotFound desc = could not find container \"4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51\": container with ID starting with 4883eafe25e5378359f33c880d68163671780338478ccfb65016bb1abb500c51 not found: ID does not exist"
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.648202 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"]
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.651304 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7b5cf75d59-hrb79"]
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.654917 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"]
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.658964 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c8565cd6-cm4dl"]
Mar 20 13:27:24 crc kubenswrapper[4856]: I0320 13:27:24.985948 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5w74v" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:27:24 crc kubenswrapper[4856]: 	timeout: failed to connect service ":50051" within 1s
Mar 20 13:27:24 crc kubenswrapper[4856]:  >
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057618 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 13:27:25 crc kubenswrapper[4856]: E0320 13:27:25.057820 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" containerName="controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057832 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" containerName="controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: E0320 13:27:25.057844 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" containerName="route-controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057851 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" containerName="route-controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: E0320 13:27:25.057868 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02e21f2d-7c3e-489e-a49c-c800d328a1cd" containerName="pruner"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057874 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="02e21f2d-7c3e-489e-a49c-c800d328a1cd" containerName="pruner"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057959 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" containerName="controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057972 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="02e21f2d-7c3e-489e-a49c-c800d328a1cd" containerName="pruner"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.057978 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" containerName="route-controller-manager"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.058311 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.061523 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.061740 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.069801 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.231245 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.231323 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.231341 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.332742 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.332815 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.332839 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.332914 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.332949 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.349714 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access\") pod \"installer-9-crc\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.390593 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.771877 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.839820 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7126d8-9ade-4ab3-8563-2e18034bab6d" path="/var/lib/kubelet/pods/9d7126d8-9ade-4ab3-8563-2e18034bab6d/volumes"
Mar 20 13:27:25 crc kubenswrapper[4856]: I0320 13:27:25.840591 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ed0de4-eed8-4b46-857d-61938ae2d9d4" path="/var/lib/kubelet/pods/e9ed0de4-eed8-4b46-857d-61938ae2d9d4/volumes"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.253102 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"]
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.253863 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.258739 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"]
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.259582 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.260891 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.261328 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.261526 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.261811 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.262059 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.262499 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.266228 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"]
Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.267897 4856 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.267897 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.268057 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.268116 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.268180 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.269626 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.271550 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.274015 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"] Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345633 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345691 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345730 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345753 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345776 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6znr6\" (UniqueName: \"kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.345962 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rprx\" (UniqueName: \"kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: 
\"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.346064 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.346227 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.346374 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.447353 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.447751 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.447777 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.447808 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.448986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449092 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449147 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449173 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6znr6\" (UniqueName: \"kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449217 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rprx\" (UniqueName: \"kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449255 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449288 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: 
\"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.449288 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.450020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.453762 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.457121 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.462171 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.463804 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rprx\" (UniqueName: \"kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx\") pod \"route-controller-manager-7f86f49694-brwkx\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.465807 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6znr6\" (UniqueName: \"kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6\") pod \"controller-manager-587f8d8786-nr6qv\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.587717 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.600533 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.621258 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffbebe8f-fc28-4542-8f04-f939ea62d4f8","Type":"ContainerStarted","Data":"dbf3f309a3dc6f3a2b9ee37dcefa9f2f4d6b0c391feb27b9407993ee48585edd"} Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.621482 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffbebe8f-fc28-4542-8f04-f939ea62d4f8","Type":"ContainerStarted","Data":"0457b24a82a38d1ecedfb9392fa8fa0e091a952a7a2bb02f4229e89a3af8cf3a"} Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.791615 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.79159739 podStartE2EDuration="1.79159739s" podCreationTimestamp="2026-03-20 13:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:26.639102523 +0000 UTC m=+261.520128663" watchObservedRunningTime="2026-03-20 13:27:26.79159739 +0000 UTC m=+261.672623520" Mar 20 13:27:26 crc kubenswrapper[4856]: I0320 13:27:26.798992 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"] Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.064735 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"] Mar 20 13:27:27 crc kubenswrapper[4856]: W0320 13:27:27.065324 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fcce8fc_8c6b_4cd9_bfe2_de7eb2ef8a47.slice/crio-ef1efdc539ffd0b10a359fed15d91f39e45cd5c7631d8cd97b2b14eca89766c3 WatchSource:0}: Error finding container ef1efdc539ffd0b10a359fed15d91f39e45cd5c7631d8cd97b2b14eca89766c3: Status 404 returned error can't find the container with id ef1efdc539ffd0b10a359fed15d91f39e45cd5c7631d8cd97b2b14eca89766c3 Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.626913 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" event={"ID":"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50","Type":"ContainerStarted","Data":"59508e8f47f0cb82a25b4b3ce704881c112ad30a1d908080711b22f0008af3cf"} Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.627350 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" event={"ID":"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50","Type":"ContainerStarted","Data":"b734c7946ba18799f9b6d9fb656942a8bfbc3870dee2fa148dca8f563b1026e0"} Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.627367 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.628532 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" event={"ID":"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47","Type":"ContainerStarted","Data":"283dff841ba69b36a66a5b83e387a643eb0390f6ce25023a0362194e21330b67"} Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.628579 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" 
event={"ID":"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47","Type":"ContainerStarted","Data":"ef1efdc539ffd0b10a359fed15d91f39e45cd5c7631d8cd97b2b14eca89766c3"} Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.631772 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.664711 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" podStartSLOduration=5.664695085 podStartE2EDuration="5.664695085s" podCreationTimestamp="2026-03-20 13:27:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:27.664095607 +0000 UTC m=+262.545121747" watchObservedRunningTime="2026-03-20 13:27:27.664695085 +0000 UTC m=+262.545721215" Mar 20 13:27:27 crc kubenswrapper[4856]: I0320 13:27:27.667071 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" podStartSLOduration=6.6670576950000005 podStartE2EDuration="6.667057695s" podCreationTimestamp="2026-03-20 13:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:27.646455933 +0000 UTC m=+262.527482073" watchObservedRunningTime="2026-03-20 13:27:27.667057695 +0000 UTC m=+262.548083825" Mar 20 13:27:28 crc kubenswrapper[4856]: I0320 13:27:28.633427 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:28 crc kubenswrapper[4856]: I0320 13:27:28.639569 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.036743 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.037054 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.183966 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.240991 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.241043 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.284399 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.708738 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:32 crc kubenswrapper[4856]: I0320 13:27:32.711303 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:27:33 crc kubenswrapper[4856]: I0320 13:27:33.047108 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngck4"] Mar 20 13:27:33 crc kubenswrapper[4856]: I0320 13:27:33.672873 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e32090f-521c-4585-884b-650644c11aee" 
containerID="503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822" exitCode=0 Mar 20 13:27:33 crc kubenswrapper[4856]: I0320 13:27:33.672952 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerDied","Data":"503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822"} Mar 20 13:27:33 crc kubenswrapper[4856]: I0320 13:27:33.872117 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:27:33 crc kubenswrapper[4856]: I0320 13:27:33.915099 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:27:34 crc kubenswrapper[4856]: I0320 13:27:34.680771 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ngck4" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="registry-server" containerID="cri-o://fb1f4cb66483458c7c1a3403def290d898c387ea7b63f51781d9e7be835368d1" gracePeriod=2 Mar 20 13:27:35 crc kubenswrapper[4856]: I0320 13:27:35.689324 4856 generic.go:334] "Generic (PLEG): container finished" podID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerID="fb1f4cb66483458c7c1a3403def290d898c387ea7b63f51781d9e7be835368d1" exitCode=0 Mar 20 13:27:35 crc kubenswrapper[4856]: I0320 13:27:35.689377 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerDied","Data":"fb1f4cb66483458c7c1a3403def290d898c387ea7b63f51781d9e7be835368d1"} Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.404123 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.477094 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content\") pod \"0ab25c01-5fc8-4432-ab97-16a816666e4f\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.479717 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcc92\" (UniqueName: \"kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92\") pod \"0ab25c01-5fc8-4432-ab97-16a816666e4f\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.479923 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities\") pod \"0ab25c01-5fc8-4432-ab97-16a816666e4f\" (UID: \"0ab25c01-5fc8-4432-ab97-16a816666e4f\") " Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.480976 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities" (OuterVolumeSpecName: "utilities") pod "0ab25c01-5fc8-4432-ab97-16a816666e4f" (UID: "0ab25c01-5fc8-4432-ab97-16a816666e4f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.481178 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.487371 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92" (OuterVolumeSpecName: "kube-api-access-qcc92") pod "0ab25c01-5fc8-4432-ab97-16a816666e4f" (UID: "0ab25c01-5fc8-4432-ab97-16a816666e4f"). InnerVolumeSpecName "kube-api-access-qcc92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.541484 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ab25c01-5fc8-4432-ab97-16a816666e4f" (UID: "0ab25c01-5fc8-4432-ab97-16a816666e4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.582122 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ab25c01-5fc8-4432-ab97-16a816666e4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.582178 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcc92\" (UniqueName: \"kubernetes.io/projected/0ab25c01-5fc8-4432-ab97-16a816666e4f-kube-api-access-qcc92\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.697690 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerStarted","Data":"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7"} Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.699883 4856 generic.go:334] "Generic (PLEG): container finished" podID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerID="f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7" exitCode=0 Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.699933 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerDied","Data":"f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7"} Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.703723 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" event={"ID":"46f98403-30f8-40f6-afa6-6defe5937024","Type":"ContainerStarted","Data":"5bfb811dcbf7df7ee894bb7676ce502633e87593535c51987807044b32e9688c"} Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.708733 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ngck4" event={"ID":"0ab25c01-5fc8-4432-ab97-16a816666e4f","Type":"ContainerDied","Data":"fd46369b715b54706d28676e0bdfc5e06b580929d479c801301c5dbbbb0cd0ba"} Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.708806 4856 scope.go:117] "RemoveContainer" containerID="fb1f4cb66483458c7c1a3403def290d898c387ea7b63f51781d9e7be835368d1" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.708879 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ngck4" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.765849 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" podStartSLOduration=39.097896059 podStartE2EDuration="1m36.765829849s" podCreationTimestamp="2026-03-20 13:26:00 +0000 UTC" firstStartedPulling="2026-03-20 13:26:38.525741546 +0000 UTC m=+213.406767676" lastFinishedPulling="2026-03-20 13:27:36.193675316 +0000 UTC m=+271.074701466" observedRunningTime="2026-03-20 13:27:36.743255548 +0000 UTC m=+271.624281688" watchObservedRunningTime="2026-03-20 13:27:36.765829849 +0000 UTC m=+271.646855999" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.766290 4856 scope.go:117] "RemoveContainer" containerID="643eb7804d77f9b94f9e8a6b223c2071ee1af944d4bbbf31e9ee41ddafba9a15" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.793617 4856 scope.go:117] "RemoveContainer" containerID="85e76a5bdc9146299973b51fe02b98cd3d26f5f403a9956ddf0f68966bd945c9" Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.794958 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ngck4"] Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.798003 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ngck4"] Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.946198 4856 
csr.go:261] certificate signing request csr-6spq5 is approved, waiting to be issued Mar 20 13:27:36 crc kubenswrapper[4856]: I0320 13:27:36.957799 4856 csr.go:257] certificate signing request csr-6spq5 is issued Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.717699 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerStarted","Data":"6776f60792a1a4da0b8106c89896bf30d8c061604901f4f40a83db5e00cb9542"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.720216 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerStarted","Data":"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.721890 4856 generic.go:334] "Generic (PLEG): container finished" podID="46f98403-30f8-40f6-afa6-6defe5937024" containerID="5bfb811dcbf7df7ee894bb7676ce502633e87593535c51987807044b32e9688c" exitCode=0 Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.721938 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" event={"ID":"46f98403-30f8-40f6-afa6-6defe5937024","Type":"ContainerDied","Data":"5bfb811dcbf7df7ee894bb7676ce502633e87593535c51987807044b32e9688c"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.724870 4856 generic.go:334] "Generic (PLEG): container finished" podID="b99e422b-ccde-422a-869f-7898a008a66a" containerID="a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804" exitCode=0 Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.724956 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" 
event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerDied","Data":"a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.726887 4856 generic.go:334] "Generic (PLEG): container finished" podID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerID="25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7" exitCode=0 Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.726939 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerDied","Data":"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.731320 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerStarted","Data":"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5"} Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.808986 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m6lhf" podStartSLOduration=2.9576129289999997 podStartE2EDuration="56.808963612s" podCreationTimestamp="2026-03-20 13:26:41 +0000 UTC" firstStartedPulling="2026-03-20 13:26:43.256529949 +0000 UTC m=+218.137556079" lastFinishedPulling="2026-03-20 13:27:37.107880622 +0000 UTC m=+271.988906762" observedRunningTime="2026-03-20 13:27:37.803771798 +0000 UTC m=+272.684797938" watchObservedRunningTime="2026-03-20 13:27:37.808963612 +0000 UTC m=+272.689989752" Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.829319 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" path="/var/lib/kubelet/pods/0ab25c01-5fc8-4432-ab97-16a816666e4f/volumes" Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.844194 
4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7jsvf" podStartSLOduration=3.668426533 podStartE2EDuration="54.844176397s" podCreationTimestamp="2026-03-20 13:26:43 +0000 UTC" firstStartedPulling="2026-03-20 13:26:45.388064528 +0000 UTC m=+220.269090668" lastFinishedPulling="2026-03-20 13:27:36.563814412 +0000 UTC m=+271.444840532" observedRunningTime="2026-03-20 13:27:37.841062935 +0000 UTC m=+272.722089075" watchObservedRunningTime="2026-03-20 13:27:37.844176397 +0000 UTC m=+272.725202537" Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.959793 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 02:47:32.923262065 +0000 UTC Mar 20 13:27:37 crc kubenswrapper[4856]: I0320 13:27:37.959836 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6589h19m54.963428425s for next certificate rotation Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.737586 4856 generic.go:334] "Generic (PLEG): container finished" podID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerID="6776f60792a1a4da0b8106c89896bf30d8c061604901f4f40a83db5e00cb9542" exitCode=0 Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.737763 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerDied","Data":"6776f60792a1a4da0b8106c89896bf30d8c061604901f4f40a83db5e00cb9542"} Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.740840 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerStarted","Data":"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc"} Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.743705 4856 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerStarted","Data":"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594"} Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.791623 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tgq7q" podStartSLOduration=2.969377038 podStartE2EDuration="54.791606368s" podCreationTimestamp="2026-03-20 13:26:44 +0000 UTC" firstStartedPulling="2026-03-20 13:26:46.56514094 +0000 UTC m=+221.446167070" lastFinishedPulling="2026-03-20 13:27:38.38737026 +0000 UTC m=+273.268396400" observedRunningTime="2026-03-20 13:27:38.789164507 +0000 UTC m=+273.670190637" watchObservedRunningTime="2026-03-20 13:27:38.791606368 +0000 UTC m=+273.672632498" Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.803130 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-w59xx" podStartSLOduration=2.808271746 podStartE2EDuration="57.803112531s" podCreationTimestamp="2026-03-20 13:26:41 +0000 UTC" firstStartedPulling="2026-03-20 13:26:43.240173563 +0000 UTC m=+218.121199693" lastFinishedPulling="2026-03-20 13:27:38.235014348 +0000 UTC m=+273.116040478" observedRunningTime="2026-03-20 13:27:38.80206824 +0000 UTC m=+273.683094390" watchObservedRunningTime="2026-03-20 13:27:38.803112531 +0000 UTC m=+273.684138661" Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.960330 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-21 23:38:06.667640062 +0000 UTC Mar 20 13:27:38 crc kubenswrapper[4856]: I0320 13:27:38.960360 4856 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6634h10m27.707282314s for next certificate rotation Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.097217 4856 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.111007 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db9nr\" (UniqueName: \"kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr\") pod \"46f98403-30f8-40f6-afa6-6defe5937024\" (UID: \"46f98403-30f8-40f6-afa6-6defe5937024\") " Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.116210 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr" (OuterVolumeSpecName: "kube-api-access-db9nr") pod "46f98403-30f8-40f6-afa6-6defe5937024" (UID: "46f98403-30f8-40f6-afa6-6defe5937024"). InnerVolumeSpecName "kube-api-access-db9nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.212765 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-db9nr\" (UniqueName: \"kubernetes.io/projected/46f98403-30f8-40f6-afa6-6defe5937024-kube-api-access-db9nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.752004 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" event={"ID":"46f98403-30f8-40f6-afa6-6defe5937024","Type":"ContainerDied","Data":"c57fd1993efc2d36b45797f84736b8ae5e64ca628dff5b1afe6e152dc5eb7765"} Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.752386 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c57fd1993efc2d36b45797f84736b8ae5e64ca628dff5b1afe6e152dc5eb7765" Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.752064 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-xkbwc" Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.759111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerStarted","Data":"7d78c6a727eedf9dcf3c7473bba42603326d49827057f7d9fb67c5b5664e0329"} Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.987614 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:27:39 crc kubenswrapper[4856]: I0320 13:27:39.987707 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.418911 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.419234 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.466777 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.493698 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-62lq9" podStartSLOduration=3.810149069 podStartE2EDuration="56.493677063s" 
podCreationTimestamp="2026-03-20 13:26:45 +0000 UTC" firstStartedPulling="2026-03-20 13:26:46.479461659 +0000 UTC m=+221.360487789" lastFinishedPulling="2026-03-20 13:27:39.162989653 +0000 UTC m=+274.044015783" observedRunningTime="2026-03-20 13:27:39.799082753 +0000 UTC m=+274.680108943" watchObservedRunningTime="2026-03-20 13:27:41.493677063 +0000 UTC m=+276.374703193" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.840762 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.840846 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.916522 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.992075 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"] Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.992365 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" podUID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" containerName="controller-manager" containerID="cri-o://59508e8f47f0cb82a25b4b3ce704881c112ad30a1d908080711b22f0008af3cf" gracePeriod=30 Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.994892 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"] Mar 20 13:27:41 crc kubenswrapper[4856]: I0320 13:27:41.995138 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" podUID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" 
containerName="route-controller-manager" containerID="cri-o://283dff841ba69b36a66a5b83e387a643eb0390f6ce25023a0362194e21330b67" gracePeriod=30 Mar 20 13:27:42 crc kubenswrapper[4856]: I0320 13:27:42.778398 4856 generic.go:334] "Generic (PLEG): container finished" podID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" containerID="283dff841ba69b36a66a5b83e387a643eb0390f6ce25023a0362194e21330b67" exitCode=0 Mar 20 13:27:42 crc kubenswrapper[4856]: I0320 13:27:42.778471 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" event={"ID":"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47","Type":"ContainerDied","Data":"283dff841ba69b36a66a5b83e387a643eb0390f6ce25023a0362194e21330b67"} Mar 20 13:27:42 crc kubenswrapper[4856]: I0320 13:27:42.782225 4856 generic.go:334] "Generic (PLEG): container finished" podID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" containerID="59508e8f47f0cb82a25b4b3ce704881c112ad30a1d908080711b22f0008af3cf" exitCode=0 Mar 20 13:27:42 crc kubenswrapper[4856]: I0320 13:27:42.782315 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" event={"ID":"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50","Type":"ContainerDied","Data":"59508e8f47f0cb82a25b4b3ce704881c112ad30a1d908080711b22f0008af3cf"} Mar 20 13:27:42 crc kubenswrapper[4856]: I0320 13:27:42.834284 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.314969 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351469 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:27:43 crc kubenswrapper[4856]: E0320 13:27:43.351725 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46f98403-30f8-40f6-afa6-6defe5937024" containerName="oc" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351747 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="46f98403-30f8-40f6-afa6-6defe5937024" containerName="oc" Mar 20 13:27:43 crc kubenswrapper[4856]: E0320 13:27:43.351758 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="registry-server" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351766 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="registry-server" Mar 20 13:27:43 crc kubenswrapper[4856]: E0320 13:27:43.351782 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" containerName="route-controller-manager" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351789 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" containerName="route-controller-manager" Mar 20 13:27:43 crc kubenswrapper[4856]: E0320 13:27:43.351799 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="extract-utilities" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351807 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="extract-utilities" Mar 20 13:27:43 crc kubenswrapper[4856]: E0320 13:27:43.351819 4856 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="extract-content" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351826 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="extract-content" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351951 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" containerName="route-controller-manager" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351968 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="46f98403-30f8-40f6-afa6-6defe5937024" containerName="oc" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.351990 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab25c01-5fc8-4432-ab97-16a816666e4f" containerName="registry-server" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.352465 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.364452 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rprx\" (UniqueName: \"kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx\") pod \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.364501 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config\") pod \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.364709 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert\") pod \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365257 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config" (OuterVolumeSpecName: "config") pod "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" (UID: "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365314 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca" (OuterVolumeSpecName: "client-ca") pod "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" (UID: "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365451 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca\") pod \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\" (UID: \"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365565 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365604 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365671 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365699 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq4xd\" (UniqueName: 
\"kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365804 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.365818 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.372712 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx" (OuterVolumeSpecName: "kube-api-access-2rprx") pod "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" (UID: "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47"). InnerVolumeSpecName "kube-api-access-2rprx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.376384 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" (UID: "1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.376463 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466480 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq4xd\" (UniqueName: \"kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466550 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466578 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466629 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 
20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466665 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rprx\" (UniqueName: \"kubernetes.io/projected/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-kube-api-access-2rprx\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.466675 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.467598 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.467707 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.470059 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.470331 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.482774 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq4xd\" (UniqueName: \"kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd\") pod \"route-controller-manager-6dc86d4797-c9cjs\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.568016 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6znr6\" (UniqueName: \"kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6\") pod \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.568131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config\") pod \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.568178 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert\") pod \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\" (UID: 
\"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.568209 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles\") pod \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.568240 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca\") pod \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\" (UID: \"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50\") " Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.569030 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config" (OuterVolumeSpecName: "config") pod "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" (UID: "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.569075 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" (UID: "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.569237 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" (UID: "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.571407 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6" (OuterVolumeSpecName: "kube-api-access-6znr6") pod "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" (UID: "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50"). InnerVolumeSpecName "kube-api-access-6znr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.573342 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" (UID: "8cb51ac6-f52e-4071-9b5f-9f93f2b63f50"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.664342 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.668959 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6znr6\" (UniqueName: \"kubernetes.io/projected/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-kube-api-access-6znr6\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.669188 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.669197 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.669205 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.669213 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.798177 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.798172 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx" event={"ID":"1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47","Type":"ContainerDied","Data":"ef1efdc539ffd0b10a359fed15d91f39e45cd5c7631d8cd97b2b14eca89766c3"} Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.798237 4856 scope.go:117] "RemoveContainer" containerID="283dff841ba69b36a66a5b83e387a643eb0390f6ce25023a0362194e21330b67" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.804328 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.804820 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-587f8d8786-nr6qv" event={"ID":"8cb51ac6-f52e-4071-9b5f-9f93f2b63f50","Type":"ContainerDied","Data":"b734c7946ba18799f9b6d9fb656942a8bfbc3870dee2fa148dca8f563b1026e0"} Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.830201 4856 scope.go:117] "RemoveContainer" containerID="59508e8f47f0cb82a25b4b3ce704881c112ad30a1d908080711b22f0008af3cf" Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.855725 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"] Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.855967 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f86f49694-brwkx"] Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.855991 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"] Mar 20 13:27:43 crc 
kubenswrapper[4856]: I0320 13:27:43.856004 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.857377 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-587f8d8786-nr6qv"] Mar 20 13:27:43 crc kubenswrapper[4856]: I0320 13:27:43.894405 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.156753 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.157174 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.203138 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.810941 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" event={"ID":"afd2628d-7dc7-44fb-b75f-8fdd42749961","Type":"ContainerStarted","Data":"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea"} Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.811411 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.811436 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" 
event={"ID":"afd2628d-7dc7-44fb-b75f-8fdd42749961","Type":"ContainerStarted","Data":"3628ee840fe17a9e7d6cd25fce0acc986996923332d52ae7839d33a506072852"} Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.814150 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m6lhf" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="registry-server" containerID="cri-o://6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5" gracePeriod=2 Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.820123 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.830014 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" podStartSLOduration=2.829987523 podStartE2EDuration="2.829987523s" podCreationTimestamp="2026-03-20 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:44.823857091 +0000 UTC m=+279.704883231" watchObservedRunningTime="2026-03-20 13:27:44.829987523 +0000 UTC m=+279.711013693" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.871324 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.991863 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:27:44 crc kubenswrapper[4856]: I0320 13:27:44.991927 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.179911 4856 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.286174 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdvkj\" (UniqueName: \"kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj\") pod \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.286285 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities\") pod \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.286337 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content\") pod \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\" (UID: \"1433d94f-1c65-49ca-a1ed-cb24d864eb55\") " Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.287144 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities" (OuterVolumeSpecName: "utilities") pod "1433d94f-1c65-49ca-a1ed-cb24d864eb55" (UID: "1433d94f-1c65-49ca-a1ed-cb24d864eb55"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.291547 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj" (OuterVolumeSpecName: "kube-api-access-fdvkj") pod "1433d94f-1c65-49ca-a1ed-cb24d864eb55" (UID: "1433d94f-1c65-49ca-a1ed-cb24d864eb55"). 
InnerVolumeSpecName "kube-api-access-fdvkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.346403 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1433d94f-1c65-49ca-a1ed-cb24d864eb55" (UID: "1433d94f-1c65-49ca-a1ed-cb24d864eb55"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.387830 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdvkj\" (UniqueName: \"kubernetes.io/projected/1433d94f-1c65-49ca-a1ed-cb24d864eb55-kube-api-access-fdvkj\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.387873 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.387885 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1433d94f-1c65-49ca-a1ed-cb24d864eb55-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.402733 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.403402 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.824321 4856 generic.go:334] "Generic (PLEG): container finished" podID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerID="6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5" exitCode=0 Mar 20 
13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.824585 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m6lhf" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.826140 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47" path="/var/lib/kubelet/pods/1fcce8fc-8c6b-4cd9-bfe2-de7eb2ef8a47/volumes" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.827465 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" path="/var/lib/kubelet/pods/8cb51ac6-f52e-4071-9b5f-9f93f2b63f50/volumes" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.828883 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerDied","Data":"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5"} Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.828935 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m6lhf" event={"ID":"1433d94f-1c65-49ca-a1ed-cb24d864eb55","Type":"ContainerDied","Data":"fc5ad811eac3972321c3b8fd197955d720900f8d14c74fbf6e28b860e35ac3b9"} Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.828977 4856 scope.go:117] "RemoveContainer" containerID="6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.851429 4856 scope.go:117] "RemoveContainer" containerID="f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.862222 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.864907 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-m6lhf"] Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.869969 4856 scope.go:117] "RemoveContainer" containerID="d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.888126 4856 scope.go:117] "RemoveContainer" containerID="6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5" Mar 20 13:27:45 crc kubenswrapper[4856]: E0320 13:27:45.888507 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5\": container with ID starting with 6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5 not found: ID does not exist" containerID="6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.888550 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5"} err="failed to get container status \"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5\": rpc error: code = NotFound desc = could not find container \"6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5\": container with ID starting with 6fa9117ba5dac9e62fb5a71b3df3766db2e6c6133811256ec5c406fb2a581bd5 not found: ID does not exist" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.888579 4856 scope.go:117] "RemoveContainer" containerID="f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7" Mar 20 13:27:45 crc kubenswrapper[4856]: E0320 13:27:45.888825 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7\": container with ID starting with 
f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7 not found: ID does not exist" containerID="f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.888847 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7"} err="failed to get container status \"f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7\": rpc error: code = NotFound desc = could not find container \"f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7\": container with ID starting with f3a38b3e0563cad2bcb0197f9056ec40d612e4cfa66077176905acc67d9771f7 not found: ID does not exist" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.888865 4856 scope.go:117] "RemoveContainer" containerID="d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad" Mar 20 13:27:45 crc kubenswrapper[4856]: E0320 13:27:45.889108 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad\": container with ID starting with d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad not found: ID does not exist" containerID="d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad" Mar 20 13:27:45 crc kubenswrapper[4856]: I0320 13:27:45.889134 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad"} err="failed to get container status \"d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad\": rpc error: code = NotFound desc = could not find container \"d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad\": container with ID starting with d7a5e88774992379387937617f1b2970a800490cd563088bf7fa463b83b8c8ad not found: ID does not 
exist" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.028983 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tgq7q" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" probeResult="failure" output=< Mar 20 13:27:46 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:27:46 crc kubenswrapper[4856]: > Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.249654 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"] Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.265758 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:27:46 crc kubenswrapper[4856]: E0320 13:27:46.266004 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="extract-content" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266020 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="extract-content" Mar 20 13:27:46 crc kubenswrapper[4856]: E0320 13:27:46.266032 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="registry-server" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266040 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="registry-server" Mar 20 13:27:46 crc kubenswrapper[4856]: E0320 13:27:46.266055 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" containerName="controller-manager" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266064 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" containerName="controller-manager" Mar 20 13:27:46 crc 
kubenswrapper[4856]: E0320 13:27:46.266083 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="extract-utilities" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266091 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="extract-utilities" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266239 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cb51ac6-f52e-4071-9b5f-9f93f2b63f50" containerName="controller-manager" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266256 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" containerName="registry-server" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.266676 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.269061 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.271782 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.272040 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.272102 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.272676 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.273200 
4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.286021 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.287310 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.398782 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.398827 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.398903 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.399073 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2xhc\" (UniqueName: 
\"kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.399125 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.448002 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-62lq9" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="registry-server" probeResult="failure" output=< Mar 20 13:27:46 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:27:46 crc kubenswrapper[4856]: > Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.499853 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.499921 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.499993 
4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.500083 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2xhc\" (UniqueName: \"kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.500115 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.501890 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.502020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " 
pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.502971 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.507549 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.524320 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2xhc\" (UniqueName: \"kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc\") pod \"controller-manager-5d7df666d7-zq2sw\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:46 crc kubenswrapper[4856]: I0320 13:27:46.583526 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.053974 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:27:47 crc kubenswrapper[4856]: W0320 13:27:47.058850 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d3abf6c_c633_4acd_b499_5caeea41e753.slice/crio-1f4c077640840d05d2ab839bdf5e262ddc9c3083a95a483f6155d7c873289d8f WatchSource:0}: Error finding container 1f4c077640840d05d2ab839bdf5e262ddc9c3083a95a483f6155d7c873289d8f: Status 404 returned error can't find the container with id 1f4c077640840d05d2ab839bdf5e262ddc9c3083a95a483f6155d7c873289d8f Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.828769 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1433d94f-1c65-49ca-a1ed-cb24d864eb55" path="/var/lib/kubelet/pods/1433d94f-1c65-49ca-a1ed-cb24d864eb55/volumes" Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.838817 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" event={"ID":"5d3abf6c-c633-4acd-b499-5caeea41e753","Type":"ContainerStarted","Data":"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8"} Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.838877 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" event={"ID":"5d3abf6c-c633-4acd-b499-5caeea41e753","Type":"ContainerStarted","Data":"1f4c077640840d05d2ab839bdf5e262ddc9c3083a95a483f6155d7c873289d8f"} Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.838954 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7jsvf" podUID="3e32090f-521c-4585-884b-650644c11aee" 
containerName="registry-server" containerID="cri-o://338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160" gracePeriod=2 Mar 20 13:27:47 crc kubenswrapper[4856]: I0320 13:27:47.864672 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" podStartSLOduration=5.86465576 podStartE2EDuration="5.86465576s" podCreationTimestamp="2026-03-20 13:27:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:47.86333394 +0000 UTC m=+282.744360090" watchObservedRunningTime="2026-03-20 13:27:47.86465576 +0000 UTC m=+282.745681890" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.193607 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.224850 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w6wj\" (UniqueName: \"kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj\") pod \"3e32090f-521c-4585-884b-650644c11aee\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.224926 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content\") pod \"3e32090f-521c-4585-884b-650644c11aee\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.225034 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities\") pod \"3e32090f-521c-4585-884b-650644c11aee\" (UID: \"3e32090f-521c-4585-884b-650644c11aee\") " Mar 20 
13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.227123 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities" (OuterVolumeSpecName: "utilities") pod "3e32090f-521c-4585-884b-650644c11aee" (UID: "3e32090f-521c-4585-884b-650644c11aee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.230893 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj" (OuterVolumeSpecName: "kube-api-access-5w6wj") pod "3e32090f-521c-4585-884b-650644c11aee" (UID: "3e32090f-521c-4585-884b-650644c11aee"). InnerVolumeSpecName "kube-api-access-5w6wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.253076 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3e32090f-521c-4585-884b-650644c11aee" (UID: "3e32090f-521c-4585-884b-650644c11aee"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.326901 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.326952 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w6wj\" (UniqueName: \"kubernetes.io/projected/3e32090f-521c-4585-884b-650644c11aee-kube-api-access-5w6wj\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.326977 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e32090f-521c-4585-884b-650644c11aee-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.851798 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e32090f-521c-4585-884b-650644c11aee" containerID="338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160" exitCode=0 Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.851952 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7jsvf" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.852308 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerDied","Data":"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160"} Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.852535 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.852716 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7jsvf" event={"ID":"3e32090f-521c-4585-884b-650644c11aee","Type":"ContainerDied","Data":"828efe73d78d1217a9adf108b65ddc42a4957c70b2d547b3a62f084d5541de6e"} Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.852589 4856 scope.go:117] "RemoveContainer" containerID="338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.859181 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.870773 4856 scope.go:117] "RemoveContainer" containerID="503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.905772 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"] Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.908677 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7jsvf"] Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.908850 4856 scope.go:117] "RemoveContainer" 
containerID="50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.936692 4856 scope.go:117] "RemoveContainer" containerID="338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160" Mar 20 13:27:48 crc kubenswrapper[4856]: E0320 13:27:48.937788 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160\": container with ID starting with 338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160 not found: ID does not exist" containerID="338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.937825 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160"} err="failed to get container status \"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160\": rpc error: code = NotFound desc = could not find container \"338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160\": container with ID starting with 338f5ccac36a557bd570f30330ce97100bf3aa76679367b2299c8c6de6f99160 not found: ID does not exist" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.937849 4856 scope.go:117] "RemoveContainer" containerID="503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822" Mar 20 13:27:48 crc kubenswrapper[4856]: E0320 13:27:48.938320 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822\": container with ID starting with 503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822 not found: ID does not exist" containerID="503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822" Mar 20 13:27:48 crc 
kubenswrapper[4856]: I0320 13:27:48.938349 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822"} err="failed to get container status \"503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822\": rpc error: code = NotFound desc = could not find container \"503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822\": container with ID starting with 503e91e1eca51c2b66f078407246dc5f378c6b4bddba90a133d62f56250da822 not found: ID does not exist" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.938372 4856 scope.go:117] "RemoveContainer" containerID="50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49" Mar 20 13:27:48 crc kubenswrapper[4856]: E0320 13:27:48.938628 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49\": container with ID starting with 50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49 not found: ID does not exist" containerID="50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49" Mar 20 13:27:48 crc kubenswrapper[4856]: I0320 13:27:48.938656 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49"} err="failed to get container status \"50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49\": rpc error: code = NotFound desc = could not find container \"50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49\": container with ID starting with 50d67f313416a88c2c5cc7ad2c48c7e5a1c6cd8a198e430eaca0dccda10b3f49 not found: ID does not exist" Mar 20 13:27:49 crc kubenswrapper[4856]: I0320 13:27:49.832873 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e32090f-521c-4585-884b-650644c11aee" 
path="/var/lib/kubelet/pods/3e32090f-521c-4585-884b-650644c11aee/volumes" Mar 20 13:27:51 crc kubenswrapper[4856]: I0320 13:27:51.487047 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:27:55 crc kubenswrapper[4856]: I0320 13:27:55.059248 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:27:55 crc kubenswrapper[4856]: I0320 13:27:55.126631 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:27:55 crc kubenswrapper[4856]: I0320 13:27:55.454845 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:55 crc kubenswrapper[4856]: I0320 13:27:55.530836 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:55 crc kubenswrapper[4856]: I0320 13:27:55.857786 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-lpbh5"] Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.447687 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"] Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.448315 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-62lq9" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="registry-server" containerID="cri-o://7d78c6a727eedf9dcf3c7473bba42603326d49827057f7d9fb67c5b5664e0329" gracePeriod=2 Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.917359 4856 generic.go:334] "Generic (PLEG): container finished" podID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerID="7d78c6a727eedf9dcf3c7473bba42603326d49827057f7d9fb67c5b5664e0329" exitCode=0 Mar 20 
13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.917398 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerDied","Data":"7d78c6a727eedf9dcf3c7473bba42603326d49827057f7d9fb67c5b5664e0329"} Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.917465 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-62lq9" event={"ID":"54a7156e-6f00-4f4a-98c8-9f592406eea3","Type":"ContainerDied","Data":"8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce"} Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.917483 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c8659496374daa8c763f61b3d147a675948c998a6f7286b1c93ff226f24dcce" Mar 20 13:27:58 crc kubenswrapper[4856]: I0320 13:27:58.928851 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.012345 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities\") pod \"54a7156e-6f00-4f4a-98c8-9f592406eea3\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.012402 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkjtx\" (UniqueName: \"kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx\") pod \"54a7156e-6f00-4f4a-98c8-9f592406eea3\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.013613 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities" (OuterVolumeSpecName: 
"utilities") pod "54a7156e-6f00-4f4a-98c8-9f592406eea3" (UID: "54a7156e-6f00-4f4a-98c8-9f592406eea3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.013629 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content\") pod \"54a7156e-6f00-4f4a-98c8-9f592406eea3\" (UID: \"54a7156e-6f00-4f4a-98c8-9f592406eea3\") " Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.014111 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.022469 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx" (OuterVolumeSpecName: "kube-api-access-fkjtx") pod "54a7156e-6f00-4f4a-98c8-9f592406eea3" (UID: "54a7156e-6f00-4f4a-98c8-9f592406eea3"). InnerVolumeSpecName "kube-api-access-fkjtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.115127 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkjtx\" (UniqueName: \"kubernetes.io/projected/54a7156e-6f00-4f4a-98c8-9f592406eea3-kube-api-access-fkjtx\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.139308 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54a7156e-6f00-4f4a-98c8-9f592406eea3" (UID: "54a7156e-6f00-4f4a-98c8-9f592406eea3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.216365 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54a7156e-6f00-4f4a-98c8-9f592406eea3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.921435 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-62lq9" Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.940506 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"] Mar 20 13:27:59 crc kubenswrapper[4856]: I0320 13:27:59.948201 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-62lq9"] Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.148876 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566888-wnkb9"] Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149498 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="extract-content" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149519 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="extract-content" Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149537 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="extract-utilities" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149550 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="extract-utilities" Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149571 4856 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149583 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149597 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="extract-utilities" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149608 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="extract-utilities" Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149625 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="extract-content" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149637 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="extract-content" Mar 20 13:28:00 crc kubenswrapper[4856]: E0320 13:28:00.149657 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149669 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149837 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e32090f-521c-4585-884b-650644c11aee" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.149859 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" containerName="registry-server" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.150463 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.160584 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-wnkb9"] Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.162114 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.162319 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.162786 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.329052 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xswk\" (UniqueName: \"kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk\") pod \"auto-csr-approver-29566888-wnkb9\" (UID: \"6f2e80da-b046-416e-9a95-1ebd9beba283\") " pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.430568 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xswk\" (UniqueName: \"kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk\") pod \"auto-csr-approver-29566888-wnkb9\" (UID: \"6f2e80da-b046-416e-9a95-1ebd9beba283\") " pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.463889 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xswk\" (UniqueName: \"kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk\") pod \"auto-csr-approver-29566888-wnkb9\" (UID: \"6f2e80da-b046-416e-9a95-1ebd9beba283\") " 
pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.472589 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:00 crc kubenswrapper[4856]: I0320 13:28:00.936735 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-wnkb9"] Mar 20 13:28:00 crc kubenswrapper[4856]: W0320 13:28:00.949361 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f2e80da_b046_416e_9a95_1ebd9beba283.slice/crio-a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b WatchSource:0}: Error finding container a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b: Status 404 returned error can't find the container with id a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b Mar 20 13:28:01 crc kubenswrapper[4856]: I0320 13:28:01.831403 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a7156e-6f00-4f4a-98c8-9f592406eea3" path="/var/lib/kubelet/pods/54a7156e-6f00-4f4a-98c8-9f592406eea3/volumes" Mar 20 13:28:01 crc kubenswrapper[4856]: I0320 13:28:01.935776 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" event={"ID":"6f2e80da-b046-416e-9a95-1ebd9beba283","Type":"ContainerStarted","Data":"a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b"} Mar 20 13:28:01 crc kubenswrapper[4856]: I0320 13:28:01.969834 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:28:01 crc kubenswrapper[4856]: I0320 13:28:01.970157 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" podUID="5d3abf6c-c633-4acd-b499-5caeea41e753" 
containerName="controller-manager" containerID="cri-o://7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8" gracePeriod=30 Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.067069 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.067939 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" podUID="afd2628d-7dc7-44fb-b75f-8fdd42749961" containerName="route-controller-manager" containerID="cri-o://46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea" gracePeriod=30 Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.572394 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.582427 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.757808 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca\") pod \"5d3abf6c-c633-4acd-b499-5caeea41e753\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.757904 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert\") pod \"5d3abf6c-c633-4acd-b499-5caeea41e753\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.757981 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca\") pod \"afd2628d-7dc7-44fb-b75f-8fdd42749961\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758042 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq4xd\" (UniqueName: \"kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd\") pod \"afd2628d-7dc7-44fb-b75f-8fdd42749961\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758148 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config\") pod \"afd2628d-7dc7-44fb-b75f-8fdd42749961\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config\") pod \"5d3abf6c-c633-4acd-b499-5caeea41e753\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758262 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles\") pod \"5d3abf6c-c633-4acd-b499-5caeea41e753\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758374 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2xhc\" (UniqueName: \"kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc\") pod \"5d3abf6c-c633-4acd-b499-5caeea41e753\" (UID: \"5d3abf6c-c633-4acd-b499-5caeea41e753\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.758453 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert\") pod \"afd2628d-7dc7-44fb-b75f-8fdd42749961\" (UID: \"afd2628d-7dc7-44fb-b75f-8fdd42749961\") " Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.759446 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d3abf6c-c633-4acd-b499-5caeea41e753" (UID: "5d3abf6c-c633-4acd-b499-5caeea41e753"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.760189 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5d3abf6c-c633-4acd-b499-5caeea41e753" (UID: "5d3abf6c-c633-4acd-b499-5caeea41e753"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.760501 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config" (OuterVolumeSpecName: "config") pod "5d3abf6c-c633-4acd-b499-5caeea41e753" (UID: "5d3abf6c-c633-4acd-b499-5caeea41e753"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.760764 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config" (OuterVolumeSpecName: "config") pod "afd2628d-7dc7-44fb-b75f-8fdd42749961" (UID: "afd2628d-7dc7-44fb-b75f-8fdd42749961"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.761053 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca" (OuterVolumeSpecName: "client-ca") pod "afd2628d-7dc7-44fb-b75f-8fdd42749961" (UID: "afd2628d-7dc7-44fb-b75f-8fdd42749961"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.765459 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc" (OuterVolumeSpecName: "kube-api-access-w2xhc") pod "5d3abf6c-c633-4acd-b499-5caeea41e753" (UID: "5d3abf6c-c633-4acd-b499-5caeea41e753"). InnerVolumeSpecName "kube-api-access-w2xhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.765560 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d3abf6c-c633-4acd-b499-5caeea41e753" (UID: "5d3abf6c-c633-4acd-b499-5caeea41e753"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.766648 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd" (OuterVolumeSpecName: "kube-api-access-pq4xd") pod "afd2628d-7dc7-44fb-b75f-8fdd42749961" (UID: "afd2628d-7dc7-44fb-b75f-8fdd42749961"). InnerVolumeSpecName "kube-api-access-pq4xd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.766780 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afd2628d-7dc7-44fb-b75f-8fdd42749961" (UID: "afd2628d-7dc7-44fb-b75f-8fdd42749961"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.859865 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.859927 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.859954 4856 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.859981 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2xhc\" (UniqueName: \"kubernetes.io/projected/5d3abf6c-c633-4acd-b499-5caeea41e753-kube-api-access-w2xhc\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.860004 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2628d-7dc7-44fb-b75f-8fdd42749961-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.860028 4856 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d3abf6c-c633-4acd-b499-5caeea41e753-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.860048 4856 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d3abf6c-c633-4acd-b499-5caeea41e753-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.860068 4856 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2628d-7dc7-44fb-b75f-8fdd42749961-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.860088 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq4xd\" (UniqueName: \"kubernetes.io/projected/afd2628d-7dc7-44fb-b75f-8fdd42749961-kube-api-access-pq4xd\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.948939 4856 generic.go:334] "Generic (PLEG): container finished" podID="afd2628d-7dc7-44fb-b75f-8fdd42749961" containerID="46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea" exitCode=0 Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.949033 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.949039 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" event={"ID":"afd2628d-7dc7-44fb-b75f-8fdd42749961","Type":"ContainerDied","Data":"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea"} Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.949247 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs" event={"ID":"afd2628d-7dc7-44fb-b75f-8fdd42749961","Type":"ContainerDied","Data":"3628ee840fe17a9e7d6cd25fce0acc986996923332d52ae7839d33a506072852"} Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.949326 4856 scope.go:117] "RemoveContainer" containerID="46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.951832 4856 generic.go:334] "Generic (PLEG): container finished" podID="5d3abf6c-c633-4acd-b499-5caeea41e753" 
containerID="7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8" exitCode=0 Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.951908 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" event={"ID":"5d3abf6c-c633-4acd-b499-5caeea41e753","Type":"ContainerDied","Data":"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8"} Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.951913 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.951940 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5d7df666d7-zq2sw" event={"ID":"5d3abf6c-c633-4acd-b499-5caeea41e753","Type":"ContainerDied","Data":"1f4c077640840d05d2ab839bdf5e262ddc9c3083a95a483f6155d7c873289d8f"} Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.954261 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" event={"ID":"6f2e80da-b046-416e-9a95-1ebd9beba283","Type":"ContainerStarted","Data":"4274eb3c15290a12fffa04cfbf82a0accb0141de4d6cd3c5e0d9a5450fd45cc5"} Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.970232 4856 scope.go:117] "RemoveContainer" containerID="46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea" Mar 20 13:28:02 crc kubenswrapper[4856]: E0320 13:28:02.973090 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea\": container with ID starting with 46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea not found: ID does not exist" containerID="46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 
13:28:02.973134 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea"} err="failed to get container status \"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea\": rpc error: code = NotFound desc = could not find container \"46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea\": container with ID starting with 46f91f9c185d57af211cfea16b9ba5193ed25ea9f9c013451112fdfc32f231ea not found: ID does not exist" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.973165 4856 scope.go:117] "RemoveContainer" containerID="7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.983129 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" podStartSLOduration=1.54316426 podStartE2EDuration="2.983092021s" podCreationTimestamp="2026-03-20 13:28:00 +0000 UTC" firstStartedPulling="2026-03-20 13:28:00.95258639 +0000 UTC m=+295.833612560" lastFinishedPulling="2026-03-20 13:28:02.392514171 +0000 UTC m=+297.273540321" observedRunningTime="2026-03-20 13:28:02.980130313 +0000 UTC m=+297.861156463" watchObservedRunningTime="2026-03-20 13:28:02.983092021 +0000 UTC m=+297.864118161" Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.989763 4856 scope.go:117] "RemoveContainer" containerID="7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8" Mar 20 13:28:02 crc kubenswrapper[4856]: E0320 13:28:02.990517 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8\": container with ID starting with 7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8 not found: ID does not exist" containerID="7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8" 
Mar 20 13:28:02 crc kubenswrapper[4856]: I0320 13:28:02.990597 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8"} err="failed to get container status \"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8\": rpc error: code = NotFound desc = could not find container \"7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8\": container with ID starting with 7846fb643a572136335e06cc882713335987de5833232d90a8d5c96d7f6ec5b8 not found: ID does not exist" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.003379 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.007948 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dc86d4797-c9cjs"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.020601 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.024564 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5d7df666d7-zq2sw"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.274315 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78446db784-mv26w"] Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.274580 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3abf6c-c633-4acd-b499-5caeea41e753" containerName="controller-manager" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.274596 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3abf6c-c633-4acd-b499-5caeea41e753" containerName="controller-manager" Mar 20 13:28:03 crc 
kubenswrapper[4856]: E0320 13:28:03.274619 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd2628d-7dc7-44fb-b75f-8fdd42749961" containerName="route-controller-manager" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.274628 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd2628d-7dc7-44fb-b75f-8fdd42749961" containerName="route-controller-manager" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.274748 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd2628d-7dc7-44fb-b75f-8fdd42749961" containerName="route-controller-manager" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.274771 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3abf6c-c633-4acd-b499-5caeea41e753" containerName="controller-manager" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.275246 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.277384 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.277644 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.277759 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5844d56456-9kt8g"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.277794 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.277912 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:28:03 crc 
kubenswrapper[4856]: I0320 13:28:03.278034 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.278555 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.279209 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.280261 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.280803 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.281040 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.282331 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.282484 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.282717 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.284815 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78446db784-mv26w"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.289013 4856 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5844d56456-9kt8g"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.296651 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468444 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfpkn\" (UniqueName: \"kubernetes.io/projected/ea98bd8b-f6bf-4628-a862-e996d4ca021e-kube-api-access-cfpkn\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468545 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-client-ca\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468574 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea98bd8b-f6bf-4628-a862-e996d4ca021e-serving-cert\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468600 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-proxy-ca-bundles\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " 
pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468630 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41163a2-7444-49ea-87bc-bedef51ef6ed-serving-cert\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468664 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-config\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468700 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zm7\" (UniqueName: \"kubernetes.io/projected/b41163a2-7444-49ea-87bc-bedef51ef6ed-kube-api-access-n4zm7\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468741 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-client-ca\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.468841 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-config\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570516 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-config\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570685 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfpkn\" (UniqueName: \"kubernetes.io/projected/ea98bd8b-f6bf-4628-a862-e996d4ca021e-kube-api-access-cfpkn\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570740 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-client-ca\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570769 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea98bd8b-f6bf-4628-a862-e996d4ca021e-serving-cert\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " 
pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570802 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-proxy-ca-bundles\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570839 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41163a2-7444-49ea-87bc-bedef51ef6ed-serving-cert\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570875 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-config\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570912 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zm7\" (UniqueName: \"kubernetes.io/projected/b41163a2-7444-49ea-87bc-bedef51ef6ed-kube-api-access-n4zm7\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.570954 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-client-ca\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.571707 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-client-ca\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.571776 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-proxy-ca-bundles\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.572190 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea98bd8b-f6bf-4628-a862-e996d4ca021e-config\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.572916 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-client-ca\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.573571 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41163a2-7444-49ea-87bc-bedef51ef6ed-config\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.574398 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41163a2-7444-49ea-87bc-bedef51ef6ed-serving-cert\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.574507 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea98bd8b-f6bf-4628-a862-e996d4ca021e-serving-cert\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.585475 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfpkn\" (UniqueName: \"kubernetes.io/projected/ea98bd8b-f6bf-4628-a862-e996d4ca021e-kube-api-access-cfpkn\") pod \"controller-manager-5844d56456-9kt8g\" (UID: \"ea98bd8b-f6bf-4628-a862-e996d4ca021e\") " pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.588801 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zm7\" (UniqueName: \"kubernetes.io/projected/b41163a2-7444-49ea-87bc-bedef51ef6ed-kube-api-access-n4zm7\") pod \"route-controller-manager-78446db784-mv26w\" (UID: \"b41163a2-7444-49ea-87bc-bedef51ef6ed\") " 
pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.624695 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.633581 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.829919 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3abf6c-c633-4acd-b499-5caeea41e753" path="/var/lib/kubelet/pods/5d3abf6c-c633-4acd-b499-5caeea41e753/volumes" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.831883 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd2628d-7dc7-44fb-b75f-8fdd42749961" path="/var/lib/kubelet/pods/afd2628d-7dc7-44fb-b75f-8fdd42749961/volumes" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.861962 4856 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.863021 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.863531 4856 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.863730 4856 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.863862 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a" gracePeriod=15 Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864167 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864215 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864232 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864241 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864317 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864328 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 
13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864342 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864352 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864373 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864383 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864395 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864405 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864417 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864426 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.864438 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864453 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864600 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864619 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864634 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864646 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864660 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864675 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864690 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864849 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.864866 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.865011 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.865025 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.865118 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49" gracePeriod=15
Mar 20 13:28:03 crc kubenswrapper[4856]: E0320 13:28:03.865361 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.865375 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.868419 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef" gracePeriod=15
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.868528 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e" gracePeriod=15
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.868667 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca" gracePeriod=15
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.891764 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-78446db784-mv26w"]
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.894420 4856 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.894481 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.961059 4856 generic.go:334] "Generic (PLEG): container finished" podID="6f2e80da-b046-416e-9a95-1ebd9beba283" containerID="4274eb3c15290a12fffa04cfbf82a0accb0141de4d6cd3c5e0d9a5450fd45cc5" exitCode=0
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.961254 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" event={"ID":"6f2e80da-b046-416e-9a95-1ebd9beba283","Type":"ContainerDied","Data":"4274eb3c15290a12fffa04cfbf82a0accb0141de4d6cd3c5e0d9a5450fd45cc5"}
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.961733 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.962038 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.963680 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" event={"ID":"b41163a2-7444-49ea-87bc-bedef51ef6ed","Type":"ContainerStarted","Data":"49d976fac1262dae2a4764caf7fc9cb7bbeb815dfcd831d24f95d51028ffc1c5"}
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976501 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976582 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976607 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976626 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976649 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976675 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976710 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:03 crc kubenswrapper[4856]: I0320 13:28:03.976781 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.079934 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.079984 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080003 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080023 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080041 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080062 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080080 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080074 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080114 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080140 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080168 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080182 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080191 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.080236 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:28:04 crc kubenswrapper[4856]: E0320 13:28:04.224663 4856 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 13:28:04 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a" Netns:"/var/run/netns/5e4e44d3-9db2-4468-8583-a0cd8efec488" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:28:04 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 13:28:04 crc kubenswrapper[4856]: >
Mar 20 13:28:04 crc kubenswrapper[4856]: E0320 13:28:04.224761 4856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 13:28:04 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a" Netns:"/var/run/netns/5e4e44d3-9db2-4468-8583-a0cd8efec488" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:28:04 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 13:28:04 crc kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g"
Mar 20 13:28:04 crc kubenswrapper[4856]: E0320 13:28:04.224783 4856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 13:28:04 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a" Netns:"/var/run/netns/5e4e44d3-9db2-4468-8583-a0cd8efec488" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:28:04 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 13:28:04 crc kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g"
Mar 20 13:28:04 crc kubenswrapper[4856]: E0320 13:28:04.224851 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a\\\" Netns:\\\"/var/run/netns/5e4e44d3-9db2-4468-8583-a0cd8efec488\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=b172b1b79a81fb7d22d525903b4f675ae00814e5e0d42053a4af77e6082d456a;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s\\\": dial tcp 38.102.83.192:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" podUID="ea98bd8b-f6bf-4628-a862-e996d4ca021e"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.971979 4856 generic.go:334] "Generic (PLEG): container finished" podID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" containerID="dbf3f309a3dc6f3a2b9ee37dcefa9f2f4d6b0c391feb27b9407993ee48585edd" exitCode=0
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.972044 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffbebe8f-fc28-4542-8f04-f939ea62d4f8","Type":"ContainerDied","Data":"dbf3f309a3dc6f3a2b9ee37dcefa9f2f4d6b0c391feb27b9407993ee48585edd"}
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.973545 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.973961 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.974243 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.974587 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" event={"ID":"b41163a2-7444-49ea-87bc-bedef51ef6ed","Type":"ContainerStarted","Data":"f8ce2fce83b5e37c78e716dde24afc89735496c758860dba79fbc24586f7f314"}
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.975502 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.975653 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.975996 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.977441 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.977925 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.978968 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.980723 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981812 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49" exitCode=0
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981841 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e" exitCode=0
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981852 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef" exitCode=0
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981862 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca" exitCode=2
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981904 4856 scope.go:117] "RemoveContainer" containerID="8e2a82627c2cdbc9a6afa636d63324e0728850803161bd7f6e981e2aa079f009"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.981940 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g"
Mar 20 13:28:04 crc kubenswrapper[4856]: I0320 13:28:04.982504 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.259110 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-wnkb9"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.259836 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.260155 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.260553 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.260792 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused"
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.397464 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xswk\" (UniqueName: \"kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk\") pod \"6f2e80da-b046-416e-9a95-1ebd9beba283\" (UID: \"6f2e80da-b046-416e-9a95-1ebd9beba283\") "
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.407630 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk" (OuterVolumeSpecName: "kube-api-access-5xswk") pod "6f2e80da-b046-416e-9a95-1ebd9beba283" (UID: "6f2e80da-b046-416e-9a95-1ebd9beba283"). InnerVolumeSpecName "kube-api-access-5xswk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.499109 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xswk\" (UniqueName: \"kubernetes.io/projected/6f2e80da-b046-416e-9a95-1ebd9beba283-kube-api-access-5xswk\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:05 crc kubenswrapper[4856]: E0320 13:28:05.622006 4856 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 20 13:28:05 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c" Netns:"/var/run/netns/6e1a86ec-236c-4d9b-a4f2-9b0816d02681" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:28:05 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 13:28:05 crc kubenswrapper[4856]: >
Mar 20 13:28:05 crc kubenswrapper[4856]: E0320 13:28:05.622087 4856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 20 13:28:05 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c" Netns:"/var/run/netns/6e1a86ec-236c-4d9b-a4f2-9b0816d02681" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused
Mar 20 13:28:05 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 20 13:28:05 crc kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g"
Mar 20 13:28:05 crc kubenswrapper[4856]: E0320 13:28:05.622108 4856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 20 13:28:05 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c" Netns:"/var/run/netns/6e1a86ec-236c-4d9b-a4f2-9b0816d02681" IfName:"eth0"
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:28:05 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:28:05 crc kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:05 crc kubenswrapper[4856]: E0320 13:28:05.622205 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\\\": rpc error: code = Unknown desc = failed to 
create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c\\\" Netns:\\\"/var/run/netns/6e1a86ec-236c-4d9b-a4f2-9b0816d02681\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=af566b3e0397e142a222b5b5531afc8c9bc2e9c1d104781127dc39ff0e9a597c;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s\\\": dial tcp 38.102.83.192:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" podUID="ea98bd8b-f6bf-4628-a862-e996d4ca021e" Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.821784 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.822245 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.822706 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.823112 4856 status_manager.go:851] "Failed to get status for pod" 
podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.980432 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:05 crc kubenswrapper[4856]: I0320 13:28:05.980660 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.089669 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.092594 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" event={"ID":"6f2e80da-b046-416e-9a95-1ebd9beba283","Type":"ContainerDied","Data":"a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b"} Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.092627 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4b36c9ad3aa117ec29a91a5219ce27aef771c2163a4a9254d1f0182b162215b" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.092696 
4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.097854 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.098032 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.098186 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.206871 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.211187 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.211657 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.211900 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.212299 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.213427 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322612 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322724 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322747 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322882 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322901 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.322972 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.323246 4856 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.323305 4856 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.323323 4856 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.342618 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.343302 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.343641 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.344085 4856 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.344739 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.526226 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir\") pod \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.526401 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ffbebe8f-fc28-4542-8f04-f939ea62d4f8" (UID: "ffbebe8f-fc28-4542-8f04-f939ea62d4f8"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.526438 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock\") pod \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.526478 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock" (OuterVolumeSpecName: "var-lock") pod "ffbebe8f-fc28-4542-8f04-f939ea62d4f8" (UID: "ffbebe8f-fc28-4542-8f04-f939ea62d4f8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.526527 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access\") pod \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\" (UID: \"ffbebe8f-fc28-4542-8f04-f939ea62d4f8\") " Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.527129 4856 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.527167 4856 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.534447 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ffbebe8f-fc28-4542-8f04-f939ea62d4f8" 
(UID: "ffbebe8f-fc28-4542-8f04-f939ea62d4f8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:06 crc kubenswrapper[4856]: I0320 13:28:06.628339 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffbebe8f-fc28-4542-8f04-f939ea62d4f8-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.092719 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.092802 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.104553 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"ffbebe8f-fc28-4542-8f04-f939ea62d4f8","Type":"ContainerDied","Data":"0457b24a82a38d1ecedfb9392fa8fa0e091a952a7a2bb02f4229e89a3af8cf3a"} Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.104819 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0457b24a82a38d1ecedfb9392fa8fa0e091a952a7a2bb02f4229e89a3af8cf3a" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.104865 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.112001 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.113841 4856 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a" exitCode=0 Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.114210 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.114236 4856 scope.go:117] "RemoveContainer" containerID="43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.141602 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.142126 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.142451 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" 
pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.142822 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.145239 4856 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.145946 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.146335 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.146748 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" 
pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.157282 4856 scope.go:117] "RemoveContainer" containerID="a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.184957 4856 scope.go:117] "RemoveContainer" containerID="6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.211552 4856 scope.go:117] "RemoveContainer" containerID="85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.237824 4856 scope.go:117] "RemoveContainer" containerID="3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.268285 4856 scope.go:117] "RemoveContainer" containerID="7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.301904 4856 scope.go:117] "RemoveContainer" containerID="43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.302526 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\": container with ID starting with 43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49 not found: ID does not exist" containerID="43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.302553 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49"} err="failed to get container status \"43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\": rpc error: code = NotFound desc = could not find container \"43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49\": container with ID starting with 43f1d18511ebfc9615b04cd0961c2ff39d73b7405dd933e730388d102b36cd49 not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.302573 4856 scope.go:117] "RemoveContainer" containerID="a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.302962 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\": container with ID starting with a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e not found: ID does not exist" containerID="a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.303017 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e"} err="failed to get container status \"a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\": rpc error: code = NotFound desc = could not find container \"a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e\": container with ID starting with a1f1b1e6de14a6a4502f17a39083e737e737e388cee406d0460c26aee42fbc0e not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.303049 4856 scope.go:117] "RemoveContainer" containerID="6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.303699 4856 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\": container with ID starting with 6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef not found: ID does not exist" containerID="6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.303746 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef"} err="failed to get container status \"6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\": rpc error: code = NotFound desc = could not find container \"6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef\": container with ID starting with 6645dfa309962f14cbdc83be284001f6fa13bfd93fb3247b437000603b8021ef not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.303805 4856 scope.go:117] "RemoveContainer" containerID="85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.304109 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\": container with ID starting with 85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca not found: ID does not exist" containerID="85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.304131 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca"} err="failed to get container status \"85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\": rpc error: code = NotFound desc = could not find container 
\"85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca\": container with ID starting with 85fdd104164dc06c3c9e87a58a5ff669563cc371466b22cf687cd6ae695d54ca not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.304143 4856 scope.go:117] "RemoveContainer" containerID="3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.304375 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\": container with ID starting with 3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a not found: ID does not exist" containerID="3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.304395 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a"} err="failed to get container status \"3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\": rpc error: code = NotFound desc = could not find container \"3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a\": container with ID starting with 3731067c0830ca71f6fb1c4bb1bef0503dbeb4b241317c841654d49f42f2158a not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.304406 4856 scope.go:117] "RemoveContainer" containerID="7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379" Mar 20 13:28:07 crc kubenswrapper[4856]: E0320 13:28:07.304627 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\": container with ID starting with 7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379 not found: ID does not exist" 
containerID="7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.304645 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379"} err="failed to get container status \"7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\": rpc error: code = NotFound desc = could not find container \"7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379\": container with ID starting with 7f29fe57380abe193c67f45a781ca8159c334f565dc200f747d8e70bc221e379 not found: ID does not exist" Mar 20 13:28:07 crc kubenswrapper[4856]: I0320 13:28:07.828113 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:28:08 crc kubenswrapper[4856]: I0320 13:28:08.114724 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:08 crc kubenswrapper[4856]: I0320 13:28:08.114804 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:08 crc kubenswrapper[4856]: E0320 13:28:08.907863 4856 kubelet.go:1929] "Failed creating a mirror pod for" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:08 crc kubenswrapper[4856]: I0320 13:28:08.908509 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:08 crc kubenswrapper[4856]: E0320 13:28:08.920393 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-78446db784-mv26w.189e8faebe73197c openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-78446db784-mv26w,UID:b41163a2-7444-49ea-87bc-bedef51ef6ed,APIVersion:v1,ResourceVersion:29930,FieldPath:spec.containers{route-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:28:03.912849788 +0000 UTC m=+298.793875918,LastTimestamp:2026-03-20 13:28:03.912849788 +0000 UTC m=+298.793875918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:28:08 crc kubenswrapper[4856]: W0320 13:28:08.938412 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-bd81de7dfc7852ee24e92c0e7165ed4c621d22a02b93cf82b62df723e65e7ae3 WatchSource:0}: 
Error finding container bd81de7dfc7852ee24e92c0e7165ed4c621d22a02b93cf82b62df723e65e7ae3: Status 404 returned error can't find the container with id bd81de7dfc7852ee24e92c0e7165ed4c621d22a02b93cf82b62df723e65e7ae3 Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.131079 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bd81de7dfc7852ee24e92c0e7165ed4c621d22a02b93cf82b62df723e65e7ae3"} Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.988467 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.988537 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.988591 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.989159 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:28:09 crc kubenswrapper[4856]: I0320 13:28:09.989244 
4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35" gracePeriod=600 Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.138889 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164"} Mar 20 13:28:10 crc kubenswrapper[4856]: E0320 13:28:10.140635 4856 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.140756 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.141411 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.141606 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" 
pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.141671 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35" exitCode=0 Mar 20 13:28:10 crc kubenswrapper[4856]: I0320 13:28:10.141724 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35"} Mar 20 13:28:10 crc kubenswrapper[4856]: E0320 13:28:10.888702 4856 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" volumeName="registry-storage" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.153914 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379"} Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.154883 4856 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.192:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.154978 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.156691 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.157457 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.158005 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.608324 4856 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.608849 4856 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.609343 4856 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.609868 4856 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.610458 4856 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:11 crc kubenswrapper[4856]: I0320 13:28:11.610513 4856 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.610841 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="200ms" Mar 20 13:28:11 crc kubenswrapper[4856]: E0320 13:28:11.812740 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="400ms" Mar 20 13:28:12 crc kubenswrapper[4856]: E0320 13:28:12.214072 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="800ms" Mar 20 13:28:13 crc kubenswrapper[4856]: E0320 13:28:13.015788 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="1.6s" Mar 20 13:28:13 crc kubenswrapper[4856]: E0320 13:28:13.166636 4856 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.192:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-78446db784-mv26w.189e8faebe73197c openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-78446db784-mv26w,UID:b41163a2-7444-49ea-87bc-bedef51ef6ed,APIVersion:v1,ResourceVersion:29930,FieldPath:spec.containers{route-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:28:03.912849788 +0000 UTC m=+298.793875918,LastTimestamp:2026-03-20 13:28:03.912849788 +0000 UTC 
m=+298.793875918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:28:14 crc kubenswrapper[4856]: E0320 13:28:14.617870 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="3.2s" Mar 20 13:28:14 crc kubenswrapper[4856]: I0320 13:28:14.625615 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:14 crc kubenswrapper[4856]: I0320 13:28:14.625712 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:15 crc kubenswrapper[4856]: I0320 13:28:15.824239 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:15 crc kubenswrapper[4856]: I0320 13:28:15.825922 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" 
pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:15 crc kubenswrapper[4856]: I0320 13:28:15.826230 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:15 crc kubenswrapper[4856]: I0320 13:28:15.826466 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:16 crc kubenswrapper[4856]: I0320 13:28:16.819452 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:16 crc kubenswrapper[4856]: I0320 13:28:16.820493 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.224447 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.227100 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.227247 4856 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c" exitCode=1 Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.227331 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c"} Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.228112 4856 scope.go:117] "RemoveContainer" containerID="a7868fbc516578b292caa259350094244a23843931489c38818c021e242e660c" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.229328 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.229965 4856 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.230508 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.231108 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.231852 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:17 crc kubenswrapper[4856]: E0320 13:28:17.519089 4856 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:28:17 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin 
type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0" Netns:"/var/run/netns/717bcde2-ed9d-4b57-b2fe-40e0d2c0a41d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:28:17 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:28:17 crc kubenswrapper[4856]: > Mar 20 13:28:17 crc kubenswrapper[4856]: E0320 13:28:17.519390 4856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:28:17 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0" Netns:"/var/run/netns/717bcde2-ed9d-4b57-b2fe-40e0d2c0a41d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:28:17 crc kubenswrapper[4856]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:28:17 crc 
kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:17 crc kubenswrapper[4856]: E0320 13:28:17.519413 4856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:28:17 crc kubenswrapper[4856]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0" Netns:"/var/run/netns/717bcde2-ed9d-4b57-b2fe-40e0d2c0a41d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: [openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s": dial tcp 38.102.83.192:6443: connect: connection refused Mar 20 13:28:17 crc kubenswrapper[4856]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:28:17 crc kubenswrapper[4856]: > pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:17 crc kubenswrapper[4856]: E0320 13:28:17.519474 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-5844d56456-9kt8g_openshift-controller-manager(ea98bd8b-f6bf-4628-a862-e996d4ca021e)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-5844d56456-9kt8g_openshift-controller-manager_ea98bd8b-f6bf-4628-a862-e996d4ca021e_0(d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0): error adding pod openshift-controller-manager_controller-manager-5844d56456-9kt8g to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0\\\" Netns:\\\"/var/run/netns/717bcde2-ed9d-4b57-b2fe-40e0d2c0a41d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-5844d56456-9kt8g;K8S_POD_INFRA_CONTAINER_ID=d5ef4ee5171ed820d8e2f7f96300a16075b3389e38707e0f288f10216a27dbe0;K8S_POD_UID=ea98bd8b-f6bf-4628-a862-e996d4ca021e\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-5844d56456-9kt8g] networking: Multus: 
[openshift-controller-manager/controller-manager-5844d56456-9kt8g/ea98bd8b-f6bf-4628-a862-e996d4ca021e]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-5844d56456-9kt8g in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-5844d56456-9kt8g?timeout=1m0s\\\": dial tcp 38.102.83.192:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" podUID="ea98bd8b-f6bf-4628-a862-e996d4ca021e" Mar 20 13:28:17 crc kubenswrapper[4856]: E0320 13:28:17.819310 4856 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.192:6443: connect: connection refused" interval="6.4s" Mar 20 13:28:17 crc kubenswrapper[4856]: I0320 13:28:17.986861 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.240634 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 
13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.241912 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.241998 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bb0885bc28ed6eb3693a33d6fa418221fb1ce3616a9c4808f577d1e662de79f"} Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.243383 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.244057 4856 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.244796 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.245576 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.246163 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.818995 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.820239 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.820700 4856 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.821039 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.821405 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.821709 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.838176 4856 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.838222 4856 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:18 crc kubenswrapper[4856]: E0320 13:28:18.838751 4856 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:18 crc kubenswrapper[4856]: I0320 13:28:18.839852 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.252590 4856 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1da8a7d9cc809be3f354a2e4523f9302f50f15c830775bbd9c2fd48582e955a0" exitCode=0 Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.253436 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1da8a7d9cc809be3f354a2e4523f9302f50f15c830775bbd9c2fd48582e955a0"} Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.253491 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"174f79d1b9d4bc9962861cb19c8151639eb55622fe27bf9616a6b5f3b8a76a94"} Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.253898 4856 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.253921 4856 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:19 crc kubenswrapper[4856]: E0320 13:28:19.254453 4856 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.254727 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.255079 4856 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.255439 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.256186 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.258214 4856 status_manager.go:851] "Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.416475 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.422973 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.423929 4856 status_manager.go:851] "Failed to get status for pod" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" pod="openshift-infra/auto-csr-approver-29566888-wnkb9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-infra/pods/auto-csr-approver-29566888-wnkb9\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.424450 4856 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.424982 4856 status_manager.go:851] "Failed to get status for pod" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-dhzh4\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.425542 4856 status_manager.go:851] "Failed to get status for pod" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:19 crc kubenswrapper[4856]: I0320 13:28:19.426896 4856 status_manager.go:851] 
"Failed to get status for pod" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-78446db784-mv26w\": dial tcp 38.102.83.192:6443: connect: connection refused" Mar 20 13:28:20 crc kubenswrapper[4856]: I0320 13:28:20.269117 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1d810686bb52abc8278252406232e810e80bb98f881e54c388b0bcf9807424c6"} Mar 20 13:28:20 crc kubenswrapper[4856]: I0320 13:28:20.269427 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4f1eeaf70601e297549e0cf932bdafe06be2d89eba3a6fe3e1c47ee4facdafa2"} Mar 20 13:28:20 crc kubenswrapper[4856]: I0320 13:28:20.269453 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"32ee870a70b40b6d35e0a938774ae1c5ecb63860a3dc8d470c4709156c411358"} Mar 20 13:28:20 crc kubenswrapper[4856]: I0320 13:28:20.269473 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:28:20 crc kubenswrapper[4856]: I0320 13:28:20.913084 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerName="oauth-openshift" containerID="cri-o://09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49" gracePeriod=15 Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.260203 4856 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.279494 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2ab1f42335a12d290166763c02dd83729788964a79aa892d655a2699f447c942"} Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.279555 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bc460841923ee2ea53fad24187a1ef33cb15b9ae214449acff211b58a98a4679"} Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.279718 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.279865 4856 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.279902 4856 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.281433 4856 generic.go:334] "Generic (PLEG): container finished" podID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerID="09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49" exitCode=0 Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.281499 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.281499 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" event={"ID":"798f1ea0-5ae3-41a3-b063-d7014df08ced","Type":"ContainerDied","Data":"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49"} Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.281550 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-lpbh5" event={"ID":"798f1ea0-5ae3-41a3-b063-d7014df08ced","Type":"ContainerDied","Data":"c37bb491c123491a3b579aeed3be3ddea8427919f92ed1caf05771f94fe73b63"} Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.281614 4856 scope.go:117] "RemoveContainer" containerID="09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.301700 4856 scope.go:117] "RemoveContainer" containerID="09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49" Mar 20 13:28:21 crc kubenswrapper[4856]: E0320 13:28:21.302147 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49\": container with ID starting with 09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49 not found: ID does not exist" containerID="09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.302200 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49"} err="failed to get container status \"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49\": rpc error: code = NotFound desc = could not find container 
\"09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49\": container with ID starting with 09ec55611566246d3cacbe28c73c65b259e81f5d41be38ea694084e89bdfad49 not found: ID does not exist" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426067 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426390 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426421 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426483 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426507 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: 
\"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426533 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426557 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426591 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426612 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426638 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: 
\"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426631 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426672 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426699 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426730 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: \"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.426759 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9zjb\" (UniqueName: \"kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb\") pod \"798f1ea0-5ae3-41a3-b063-d7014df08ced\" (UID: 
\"798f1ea0-5ae3-41a3-b063-d7014df08ced\") " Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.427009 4856 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.427237 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.427396 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.427836 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.427899 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.435440 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb" (OuterVolumeSpecName: "kube-api-access-w9zjb") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "kube-api-access-w9zjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.436731 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.437428 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.441594 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.442389 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.442868 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.446575 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.447010 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.447888 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "798f1ea0-5ae3-41a3-b063-d7014df08ced" (UID: "798f1ea0-5ae3-41a3-b063-d7014df08ced"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.527902 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.527949 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.527962 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.527978 4856 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.527990 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528005 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528019 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528031 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528042 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528054 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9zjb\" (UniqueName: 
\"kubernetes.io/projected/798f1ea0-5ae3-41a3-b063-d7014df08ced-kube-api-access-w9zjb\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528067 4856 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/798f1ea0-5ae3-41a3-b063-d7014df08ced-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528078 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:21 crc kubenswrapper[4856]: I0320 13:28:21.528092 4856 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/798f1ea0-5ae3-41a3-b063-d7014df08ced-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:23 crc kubenswrapper[4856]: I0320 13:28:23.840978 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:23 crc kubenswrapper[4856]: I0320 13:28:23.841711 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:23 crc kubenswrapper[4856]: I0320 13:28:23.848103 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:24 crc kubenswrapper[4856]: I0320 13:28:24.626039 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:24 crc 
kubenswrapper[4856]: I0320 13:28:24.626119 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:26 crc kubenswrapper[4856]: I0320 13:28:26.292883 4856 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:26 crc kubenswrapper[4856]: I0320 13:28:26.325895 4856 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:26 crc kubenswrapper[4856]: I0320 13:28:26.325925 4856 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:26 crc kubenswrapper[4856]: I0320 13:28:26.334461 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:26 crc kubenswrapper[4856]: I0320 13:28:26.396488 4856 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5f11694-bdd7-494d-9ae4-ba9527ac2d1e" Mar 20 13:28:26 crc kubenswrapper[4856]: E0320 13:28:26.573168 4856 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-ocp-branding-template\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Mar 20 13:28:27 crc kubenswrapper[4856]: I0320 13:28:27.332781 4856 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 
13:28:27 crc kubenswrapper[4856]: I0320 13:28:27.333133 4856 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="cbf4a95d-4868-45a5-b740-b2681c6643d6" Mar 20 13:28:27 crc kubenswrapper[4856]: I0320 13:28:27.337631 4856 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a5f11694-bdd7-494d-9ae4-ba9527ac2d1e" Mar 20 13:28:27 crc kubenswrapper[4856]: I0320 13:28:27.995005 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:28:29 crc kubenswrapper[4856]: I0320 13:28:29.819457 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:29 crc kubenswrapper[4856]: I0320 13:28:29.820002 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:30 crc kubenswrapper[4856]: W0320 13:28:30.213252 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea98bd8b_f6bf_4628_a862_e996d4ca021e.slice/crio-45df15670d44539c66f4d0d7e9e1ac3bf2a8cb26e7888bec14712466f99479c5 WatchSource:0}: Error finding container 45df15670d44539c66f4d0d7e9e1ac3bf2a8cb26e7888bec14712466f99479c5: Status 404 returned error can't find the container with id 45df15670d44539c66f4d0d7e9e1ac3bf2a8cb26e7888bec14712466f99479c5 Mar 20 13:28:30 crc kubenswrapper[4856]: I0320 13:28:30.352798 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" event={"ID":"ea98bd8b-f6bf-4628-a862-e996d4ca021e","Type":"ContainerStarted","Data":"45df15670d44539c66f4d0d7e9e1ac3bf2a8cb26e7888bec14712466f99479c5"} Mar 20 13:28:31 crc kubenswrapper[4856]: I0320 13:28:31.362070 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" event={"ID":"ea98bd8b-f6bf-4628-a862-e996d4ca021e","Type":"ContainerStarted","Data":"c00c1845530ae1ec4d619ed78e73b73775bf73a7192da7f688627295a8da4e6b"} Mar 20 13:28:31 crc kubenswrapper[4856]: I0320 13:28:31.362577 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:31 crc kubenswrapper[4856]: I0320 13:28:31.371086 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" Mar 20 13:28:34 crc kubenswrapper[4856]: I0320 13:28:34.401547 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:55148->10.217.0.67:8443: read: connection reset by peer" start-of-body= Mar 20 13:28:34 crc kubenswrapper[4856]: I0320 13:28:34.402120 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": read tcp 10.217.0.2:55148->10.217.0.67:8443: read: connection reset by peer" Mar 20 13:28:35 crc kubenswrapper[4856]: I0320 13:28:35.391148 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-78446db784-mv26w_b41163a2-7444-49ea-87bc-bedef51ef6ed/route-controller-manager/0.log" Mar 20 13:28:35 crc kubenswrapper[4856]: I0320 13:28:35.391545 4856 generic.go:334] "Generic (PLEG): container finished" podID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerID="f8ce2fce83b5e37c78e716dde24afc89735496c758860dba79fbc24586f7f314" exitCode=255 Mar 20 13:28:35 crc kubenswrapper[4856]: I0320 13:28:35.391590 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" event={"ID":"b41163a2-7444-49ea-87bc-bedef51ef6ed","Type":"ContainerDied","Data":"f8ce2fce83b5e37c78e716dde24afc89735496c758860dba79fbc24586f7f314"} Mar 20 13:28:35 crc kubenswrapper[4856]: I0320 13:28:35.392866 4856 scope.go:117] "RemoveContainer" containerID="f8ce2fce83b5e37c78e716dde24afc89735496c758860dba79fbc24586f7f314" Mar 20 13:28:35 crc kubenswrapper[4856]: I0320 13:28:35.886878 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.037312 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 
13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.096867 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.115613 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.401257 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-78446db784-mv26w_b41163a2-7444-49ea-87bc-bedef51ef6ed/route-controller-manager/0.log" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.401327 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" event={"ID":"b41163a2-7444-49ea-87bc-bedef51ef6ed","Type":"ContainerStarted","Data":"5885f7a581c5ea4e14ccea1f08a082adf4fd2e0b1c23bd1e931c76c5b63406f3"} Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.401644 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.594459 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.679856 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.805208 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:28:36 crc kubenswrapper[4856]: I0320 13:28:36.831850 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 
13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.029106 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.091917 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.177259 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.207649 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.401745 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.402118 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.439219 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.531140 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 
13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.577643 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.656549 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.856701 4856 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.857455 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5844d56456-9kt8g" podStartSLOduration=36.857430196 podStartE2EDuration="36.857430196s" podCreationTimestamp="2026-03-20 13:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:31.388232617 +0000 UTC m=+326.269258777" watchObservedRunningTime="2026-03-20 13:28:37.857430196 +0000 UTC m=+332.738456356" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.857865 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podStartSLOduration=35.857854767 podStartE2EDuration="35.857854767s" podCreationTimestamp="2026-03-20 13:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:26.385098999 +0000 UTC m=+321.266125129" watchObservedRunningTime="2026-03-20 13:28:37.857854767 +0000 UTC m=+332.738880937" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.863980 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-lpbh5"] Mar 20 13:28:37 
crc kubenswrapper[4856]: I0320 13:28:37.864058 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.864091 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5844d56456-9kt8g"] Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.871638 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:28:37 crc kubenswrapper[4856]: I0320 13:28:37.901567 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=11.901544741 podStartE2EDuration="11.901544741s" podCreationTimestamp="2026-03-20 13:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:37.898492958 +0000 UTC m=+332.779519148" watchObservedRunningTime="2026-03-20 13:28:37.901544741 +0000 UTC m=+332.782570891" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.295390 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.406204 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.407918 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.408074 4856 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.565550 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.762679 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.857992 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:28:38 crc kubenswrapper[4856]: I0320 13:28:38.960585 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.023161 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.073655 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.216163 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.362069 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.501961 4856 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.590438 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.601065 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.633452 4856 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.648238 4856 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.658043 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.743423 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.789321 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.792321 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.823982 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.828527 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" path="/var/lib/kubelet/pods/798f1ea0-5ae3-41a3-b063-d7014df08ced/volumes" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.844879 4856 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:28:39 crc kubenswrapper[4856]: I0320 13:28:39.849015 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.025827 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.038898 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.064391 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.121940 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.189939 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.232550 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.291937 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.331098 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.361927 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" 
Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.483591 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.561045 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.561327 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.600512 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.654933 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.663166 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.757003 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.770048 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.789505 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.843194 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.947259 4856 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 13:28:40 crc kubenswrapper[4856]: I0320 13:28:40.999106 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.002673 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.014007 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.039424 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.054506 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.087335 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.098749 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.118692 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.174372 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.206031 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.313693 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.366144 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.391327 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.527383 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.563225 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.684383 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.703931 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.894189 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.898638 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.920560 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 13:28:41 crc kubenswrapper[4856]: I0320 13:28:41.923001 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.108702 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.178686 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.241211 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.245220 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.283374 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.349786 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.371171 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.424302 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.437248 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.472797 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.619919 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.681843 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.686247 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.796542 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.796969 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.874529 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 13:28:42 crc kubenswrapper[4856]: I0320 13:28:42.987961 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.082609 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.146694 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.155702 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.259511 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.283166 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.462988 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.506596 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.516814 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.517030 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.522608 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.582194 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.655997 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.780961 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.868524 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.871154 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.939176 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 13:28:43 crc kubenswrapper[4856]: I0320 13:28:43.972449 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.000213 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.170879 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.228764 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.229726 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.278265 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.278294 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.297249 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.442171 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.487354 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.519608 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.540739 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.565998 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.596453 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.596687 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.604196 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.625806 4856 patch_prober.go:28] interesting pod/route-controller-manager-78446db784-mv26w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.625905 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" podUID="b41163a2-7444-49ea-87bc-bedef51ef6ed" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.688977 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.706909 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.718946 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.733563 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.735215 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.792237 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 13:28:44 crc kubenswrapper[4856]: I0320 13:28:44.892209 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.355419 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.439794 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.595755 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.625391 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.662464 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.687773 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.708372 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.753393 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.787772 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.806856 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.901343 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.984243 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.993872 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:28:45 crc kubenswrapper[4856]: I0320 13:28:45.995542 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.016880 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.032857 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.078449 4856 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.283125 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.364024 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.492436 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.502839 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.557209 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.592466 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.599524 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.676985 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-6fplf"]
Mar 20 13:28:46 crc kubenswrapper[4856]: E0320 13:28:46.677228 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" containerName="oc"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.677241 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" containerName="oc"
Mar 20 13:28:46 crc kubenswrapper[4856]: E0320 13:28:46.677253 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" containerName="installer"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.677261 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" containerName="installer"
Mar 20 13:28:46 crc kubenswrapper[4856]: E0320 13:28:46.677308 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerName="oauth-openshift"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.677317 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerName="oauth-openshift"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.682356 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" containerName="oc"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.682450 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbebe8f-fc28-4542-8f04-f939ea62d4f8" containerName="installer"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.682503 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="798f1ea0-5ae3-41a3-b063-d7014df08ced" containerName="oauth-openshift"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.683909 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.691604 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.691848 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.692021 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.692494 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.692538 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.692754 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.695910 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.699039 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.700295 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.700644 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.700980 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.701361 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.701555 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.706543 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.711991 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-6fplf"]
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.712381 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.731915 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.799101 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.843781 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.870992 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c6mq\" (UniqueName: \"kubernetes.io/projected/1e9883ac-f3ec-4007-ae60-0070f657b23c-kube-api-access-9c6mq\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871125 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871308 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871397 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871448 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871515 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-dir\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871564 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871759 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.871942 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.872020 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.872089 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.872143 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-policies\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.872207 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.887467 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.936294 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.973361 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.973782 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.973937 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.973961 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-dir\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974143 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974166 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-dir\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974195 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974339 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974475 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974505 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-policies\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974638 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974662 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c6mq\" (UniqueName: \"kubernetes.io/projected/1e9883ac-f3ec-4007-ae60-0070f657b23c-kube-api-access-9c6mq\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974687 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.974828 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.975538 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-audit-policies\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.975602 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-service-ca\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.975696 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.976507 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.981619 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.982401 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.982658 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-session\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf"
Mar 20 13:28:46
crc kubenswrapper[4856]: I0320 13:28:46.982752 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-router-certs\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.983630 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.983653 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-login\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.983969 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.984664 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-user-template-error\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:46 crc kubenswrapper[4856]: I0320 13:28:46.987307 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1e9883ac-f3ec-4007-ae60-0070f657b23c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.012977 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c6mq\" (UniqueName: \"kubernetes.io/projected/1e9883ac-f3ec-4007-ae60-0070f657b23c-kube-api-access-9c6mq\") pod \"oauth-openshift-68974c876c-6fplf\" (UID: \"1e9883ac-f3ec-4007-ae60-0070f657b23c\") " pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.027634 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.087631 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.170183 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.187996 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.208184 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.265640 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.328598 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.373248 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.440985 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.449891 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.478448 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-68974c876c-6fplf"] Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.516935 4856 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:28:47 crc kubenswrapper[4856]: I0320 13:28:47.651318 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.079576 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.263872 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.305021 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.325679 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.388489 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.472036 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" event={"ID":"1e9883ac-f3ec-4007-ae60-0070f657b23c","Type":"ContainerStarted","Data":"aa002b2bb71990cd5b44d667d54d700d5ba4f334c721e7cf68b5188bf72b133a"} Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.472099 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" event={"ID":"1e9883ac-f3ec-4007-ae60-0070f657b23c","Type":"ContainerStarted","Data":"79228b8991007e304b3c9922a122d6c11d7a223dc0c90de1a4f09ec50f39c1d3"} Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.473368 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.480823 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.494080 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-68974c876c-6fplf" podStartSLOduration=53.494063502 podStartE2EDuration="53.494063502s" podCreationTimestamp="2026-03-20 13:27:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:48.493411754 +0000 UTC m=+343.374437924" watchObservedRunningTime="2026-03-20 13:28:48.494063502 +0000 UTC m=+343.375089642" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.504187 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.532432 4856 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.571048 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.622619 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.675871 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.795719 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 
20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.804158 4856 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.804392 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164" gracePeriod=5 Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.863724 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.891097 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:28:48 crc kubenswrapper[4856]: I0320 13:28:48.935881 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.008345 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.104920 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.369390 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.455640 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.639155 4856 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.716448 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.768732 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.848307 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:28:49 crc kubenswrapper[4856]: I0320 13:28:49.903356 4856 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.002833 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.038771 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.058938 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.129440 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.206625 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.227031 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 
13:28:50.253982 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.307386 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.387215 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.551034 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.551381 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.553316 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.832146 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.865912 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:28:50 crc kubenswrapper[4856]: I0320 13:28:50.950823 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.015338 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.017019 4856 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.181856 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.366436 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.408980 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.457846 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.533942 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.592393 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.693377 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.715451 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.794480 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:28:51 crc kubenswrapper[4856]: I0320 13:28:51.901821 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:28:51 crc 
kubenswrapper[4856]: I0320 13:28:51.954082 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.041917 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.470978 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.511182 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.511470 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.530013 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:28:52 crc kubenswrapper[4856]: I0320 13:28:52.829579 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:28:53 crc kubenswrapper[4856]: I0320 13:28:53.108411 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:28:53 crc kubenswrapper[4856]: I0320 13:28:53.632505 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-78446db784-mv26w" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.407871 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:28:54 crc 
kubenswrapper[4856]: I0320 13:28:54.408295 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.515854 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.515940 4856 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164" exitCode=137 Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.516001 4856 scope.go:117] "RemoveContainer" containerID="d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.516052 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.544603 4856 scope.go:117] "RemoveContainer" containerID="d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164" Mar 20 13:28:54 crc kubenswrapper[4856]: E0320 13:28:54.545327 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164\": container with ID starting with d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164 not found: ID does not exist" containerID="d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.545392 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164"} err="failed to get container status 
\"d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164\": rpc error: code = NotFound desc = could not find container \"d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164\": container with ID starting with d20ab78b6508abea2e767585fd4f09f11f60e544dff7b470bcaa3b518a8cc164 not found: ID does not exist" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585649 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585759 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585823 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585862 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585977 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.585991 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586074 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586158 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586162 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586507 4856 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586537 4856 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586561 4856 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.586585 4856 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.601684 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:28:54 crc kubenswrapper[4856]: I0320 13:28:54.687986 4856 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4856]: I0320 13:28:55.829183 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 13:29:10 crc kubenswrapper[4856]: I0320 13:29:10.962733 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:29:13 crc kubenswrapper[4856]: I0320 13:29:13.654703 4856 generic.go:334] "Generic (PLEG): container finished" podID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerID="ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c" exitCode=0 Mar 20 13:29:13 crc kubenswrapper[4856]: I0320 13:29:13.654804 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerDied","Data":"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c"} Mar 20 13:29:13 crc kubenswrapper[4856]: I0320 13:29:13.655587 4856 scope.go:117] "RemoveContainer" containerID="ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c" Mar 20 13:29:14 crc kubenswrapper[4856]: I0320 13:29:14.664297 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerStarted","Data":"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24"} Mar 20 13:29:14 crc kubenswrapper[4856]: I0320 13:29:14.665685 4856 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:29:14 crc kubenswrapper[4856]: I0320 13:29:14.667937 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:29:30 crc kubenswrapper[4856]: I0320 13:29:30.055959 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.146147 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566890-8ckqc"] Mar 20 13:30:00 crc kubenswrapper[4856]: E0320 13:30:00.146883 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.146899 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.147016 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.147504 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.149412 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.149824 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.150661 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq"] Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.151379 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.152016 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.152715 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.152841 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.154593 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-8ckqc"] Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.158471 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq"] Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.273850 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xlx\" (UniqueName: 
\"kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx\") pod \"auto-csr-approver-29566890-8ckqc\" (UID: \"3967f519-5176-4af8-8ada-a4b149713f76\") " pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.273964 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktwxc\" (UniqueName: \"kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.274018 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.274044 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.375562 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 
crc kubenswrapper[4856]: I0320 13:30:00.375774 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94xlx\" (UniqueName: \"kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx\") pod \"auto-csr-approver-29566890-8ckqc\" (UID: \"3967f519-5176-4af8-8ada-a4b149713f76\") " pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.375880 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktwxc\" (UniqueName: \"kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.375955 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.376658 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.382677 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.404057 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xlx\" (UniqueName: \"kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx\") pod \"auto-csr-approver-29566890-8ckqc\" (UID: \"3967f519-5176-4af8-8ada-a4b149713f76\") " pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.406303 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktwxc\" (UniqueName: \"kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc\") pod \"collect-profiles-29566890-nsknq\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.468093 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.477214 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.721938 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-8ckqc"] Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.953675 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" event={"ID":"3967f519-5176-4af8-8ada-a4b149713f76","Type":"ContainerStarted","Data":"cbcfdeb284ee0f7620acb6484beec1efa561856fe5575f2a791210323d74af6b"} Mar 20 13:30:00 crc kubenswrapper[4856]: I0320 13:30:00.994688 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq"] Mar 20 13:30:01 crc kubenswrapper[4856]: W0320 13:30:01.000153 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c337cd_c045_4fa6_953d_e30cfa4d4ec3.slice/crio-027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead WatchSource:0}: Error finding container 027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead: Status 404 returned error can't find the container with id 027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead Mar 20 13:30:01 crc kubenswrapper[4856]: I0320 13:30:01.960825 4856 generic.go:334] "Generic (PLEG): container finished" podID="35c337cd-c045-4fa6-953d-e30cfa4d4ec3" containerID="bd3aff4f27c60982965939c2511008090f46f1e7ad95effeaf9f79288c05bd4b" exitCode=0 Mar 20 13:30:01 crc kubenswrapper[4856]: I0320 13:30:01.960942 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" event={"ID":"35c337cd-c045-4fa6-953d-e30cfa4d4ec3","Type":"ContainerDied","Data":"bd3aff4f27c60982965939c2511008090f46f1e7ad95effeaf9f79288c05bd4b"} Mar 20 13:30:01 crc kubenswrapper[4856]: I0320 13:30:01.961099 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" event={"ID":"35c337cd-c045-4fa6-953d-e30cfa4d4ec3","Type":"ContainerStarted","Data":"027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead"} Mar 20 13:30:02 crc kubenswrapper[4856]: I0320 13:30:02.972394 4856 generic.go:334] "Generic (PLEG): container finished" podID="3967f519-5176-4af8-8ada-a4b149713f76" containerID="c44f1035a3891f07596fc5566cd27ca7aac02c6d2f74ad7ba600b7649026b385" exitCode=0 Mar 20 13:30:02 crc kubenswrapper[4856]: I0320 13:30:02.972467 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" event={"ID":"3967f519-5176-4af8-8ada-a4b149713f76","Type":"ContainerDied","Data":"c44f1035a3891f07596fc5566cd27ca7aac02c6d2f74ad7ba600b7649026b385"} Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.245484 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.312163 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume\") pod \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.312356 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktwxc\" (UniqueName: \"kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc\") pod \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.312402 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume\") pod \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\" (UID: \"35c337cd-c045-4fa6-953d-e30cfa4d4ec3\") " Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.313015 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume" (OuterVolumeSpecName: "config-volume") pod "35c337cd-c045-4fa6-953d-e30cfa4d4ec3" (UID: "35c337cd-c045-4fa6-953d-e30cfa4d4ec3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.313207 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.317421 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc" (OuterVolumeSpecName: "kube-api-access-ktwxc") pod "35c337cd-c045-4fa6-953d-e30cfa4d4ec3" (UID: "35c337cd-c045-4fa6-953d-e30cfa4d4ec3"). InnerVolumeSpecName "kube-api-access-ktwxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.317488 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35c337cd-c045-4fa6-953d-e30cfa4d4ec3" (UID: "35c337cd-c045-4fa6-953d-e30cfa4d4ec3"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.414834 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktwxc\" (UniqueName: \"kubernetes.io/projected/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-kube-api-access-ktwxc\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.414884 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35c337cd-c045-4fa6-953d-e30cfa4d4ec3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.983965 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.983974 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq" event={"ID":"35c337cd-c045-4fa6-953d-e30cfa4d4ec3","Type":"ContainerDied","Data":"027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead"} Mar 20 13:30:03 crc kubenswrapper[4856]: I0320 13:30:03.984040 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="027ab6710dcca05c77bb887b4d2434122f3bf88b4090e2bd3527514c137a8ead" Mar 20 13:30:04 crc kubenswrapper[4856]: I0320 13:30:04.318057 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:04 crc kubenswrapper[4856]: I0320 13:30:04.429719 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94xlx\" (UniqueName: \"kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx\") pod \"3967f519-5176-4af8-8ada-a4b149713f76\" (UID: \"3967f519-5176-4af8-8ada-a4b149713f76\") " Mar 20 13:30:04 crc kubenswrapper[4856]: I0320 13:30:04.433219 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx" (OuterVolumeSpecName: "kube-api-access-94xlx") pod "3967f519-5176-4af8-8ada-a4b149713f76" (UID: "3967f519-5176-4af8-8ada-a4b149713f76"). InnerVolumeSpecName "kube-api-access-94xlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:04 crc kubenswrapper[4856]: I0320 13:30:04.531382 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94xlx\" (UniqueName: \"kubernetes.io/projected/3967f519-5176-4af8-8ada-a4b149713f76-kube-api-access-94xlx\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:05 crc kubenswrapper[4856]: I0320 13:30:05.004166 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" event={"ID":"3967f519-5176-4af8-8ada-a4b149713f76","Type":"ContainerDied","Data":"cbcfdeb284ee0f7620acb6484beec1efa561856fe5575f2a791210323d74af6b"} Mar 20 13:30:05 crc kubenswrapper[4856]: I0320 13:30:05.005064 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcfdeb284ee0f7620acb6484beec1efa561856fe5575f2a791210323d74af6b" Mar 20 13:30:05 crc kubenswrapper[4856]: I0320 13:30:05.004233 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-8ckqc" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.492204 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j25cd"] Mar 20 13:30:09 crc kubenswrapper[4856]: E0320 13:30:09.493800 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3967f519-5176-4af8-8ada-a4b149713f76" containerName="oc" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.493900 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3967f519-5176-4af8-8ada-a4b149713f76" containerName="oc" Mar 20 13:30:09 crc kubenswrapper[4856]: E0320 13:30:09.494001 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35c337cd-c045-4fa6-953d-e30cfa4d4ec3" containerName="collect-profiles" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.494087 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="35c337cd-c045-4fa6-953d-e30cfa4d4ec3" containerName="collect-profiles" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.494351 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3967f519-5176-4af8-8ada-a4b149713f76" containerName="oc" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.494449 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="35c337cd-c045-4fa6-953d-e30cfa4d4ec3" containerName="collect-profiles" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.494965 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.502561 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j25cd"] Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596035 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-trusted-ca\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596091 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-registry-certificates\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596124 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596214 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e0392063-d6f5-4ad9-b66b-304622940907-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596333 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cm6l\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-kube-api-access-9cm6l\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596388 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e0392063-d6f5-4ad9-b66b-304622940907-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596412 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-registry-tls\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.596486 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-bound-sa-token\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.619693 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.698421 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-trusted-ca\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.698966 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-registry-certificates\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699084 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e0392063-d6f5-4ad9-b66b-304622940907-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699140 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cm6l\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-kube-api-access-9cm6l\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699191 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e0392063-d6f5-4ad9-b66b-304622940907-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699227 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-registry-tls\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699319 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-bound-sa-token\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.699852 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-trusted-ca\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.700158 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e0392063-d6f5-4ad9-b66b-304622940907-registry-certificates\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc 
kubenswrapper[4856]: I0320 13:30:09.700798 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e0392063-d6f5-4ad9-b66b-304622940907-ca-trust-extracted\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.707796 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e0392063-d6f5-4ad9-b66b-304622940907-installation-pull-secrets\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.707903 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-registry-tls\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.722020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-bound-sa-token\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.723464 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cm6l\" (UniqueName: \"kubernetes.io/projected/e0392063-d6f5-4ad9-b66b-304622940907-kube-api-access-9cm6l\") pod \"image-registry-66df7c8f76-j25cd\" (UID: \"e0392063-d6f5-4ad9-b66b-304622940907\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:09 crc kubenswrapper[4856]: I0320 13:30:09.816819 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:10 crc kubenswrapper[4856]: I0320 13:30:10.288860 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-j25cd"] Mar 20 13:30:11 crc kubenswrapper[4856]: I0320 13:30:11.042204 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" event={"ID":"e0392063-d6f5-4ad9-b66b-304622940907","Type":"ContainerStarted","Data":"93f3ae844588b28733ee39fca4fa8ae12400f079fcf689f9765bbea64509ef4d"} Mar 20 13:30:11 crc kubenswrapper[4856]: I0320 13:30:11.042305 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" event={"ID":"e0392063-d6f5-4ad9-b66b-304622940907","Type":"ContainerStarted","Data":"cf7b3f6062fd4552ff1cfc55191279d5fc4a299f49c50ec2ba42609f0d675aa9"} Mar 20 13:30:11 crc kubenswrapper[4856]: I0320 13:30:11.042457 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:11 crc kubenswrapper[4856]: I0320 13:30:11.063482 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" podStartSLOduration=2.063459435 podStartE2EDuration="2.063459435s" podCreationTimestamp="2026-03-20 13:30:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:30:11.05917033 +0000 UTC m=+425.940196510" watchObservedRunningTime="2026-03-20 13:30:11.063459435 +0000 UTC m=+425.944485605" Mar 20 13:30:29 crc kubenswrapper[4856]: I0320 13:30:29.832301 4856 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-j25cd" Mar 20 13:30:29 crc kubenswrapper[4856]: I0320 13:30:29.910529 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"] Mar 20 13:30:39 crc kubenswrapper[4856]: I0320 13:30:39.987809 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:30:39 crc kubenswrapper[4856]: I0320 13:30:39.988203 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.697020 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.697898 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dqwl2" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="registry-server" containerID="cri-o://b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74" gracePeriod=30 Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.706532 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w59xx"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.706779 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-w59xx" podUID="b99e422b-ccde-422a-869f-7898a008a66a" 
containerName="registry-server" containerID="cri-o://0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc" gracePeriod=30 Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.721751 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.721961 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" containerID="cri-o://5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24" gracePeriod=30 Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.732032 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.732323 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5w74v" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="registry-server" containerID="cri-o://a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0" gracePeriod=30 Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.748840 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4z5q"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.749706 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.753406 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tgq7q"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.753666 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tgq7q" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" containerID="cri-o://759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" gracePeriod=30 Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.770327 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4z5q"] Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.826898 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7d4\" (UniqueName: \"kubernetes.io/projected/e3871fbb-6e58-45b2-a475-a45fa18a090d-kube-api-access-5g7d4\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.826946 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.826964 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.928090 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7d4\" (UniqueName: \"kubernetes.io/projected/e3871fbb-6e58-45b2-a475-a45fa18a090d-kube-api-access-5g7d4\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.928483 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.928507 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.929819 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 
crc kubenswrapper[4856]: I0320 13:30:44.938796 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e3871fbb-6e58-45b2-a475-a45fa18a090d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: I0320 13:30:44.944037 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g7d4\" (UniqueName: \"kubernetes.io/projected/e3871fbb-6e58-45b2-a475-a45fa18a090d-kube-api-access-5g7d4\") pod \"marketplace-operator-79b997595-n4z5q\" (UID: \"e3871fbb-6e58-45b2-a475-a45fa18a090d\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:44 crc kubenswrapper[4856]: E0320 13:30:44.992701 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 is running failed: container process not found" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 13:30:44 crc kubenswrapper[4856]: E0320 13:30:44.993052 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 is running failed: container process not found" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 13:30:44 crc kubenswrapper[4856]: E0320 13:30:44.993405 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 is running failed: container process not found" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 13:30:44 crc kubenswrapper[4856]: E0320 13:30:44.993445 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-tgq7q" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.114528 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.123906 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.174078 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.177930 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.209809 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.214936 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250451 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca\") pod \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250732 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities\") pod \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250757 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content\") pod \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250781 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngph6\" (UniqueName: \"kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6\") pod \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\" (UID: \"d543f7e0-d967-4e5a-8cae-19da02f5a7e8\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250859 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics\") pod \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250919 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b28nm\" (UniqueName: \"kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm\") pod \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\" (UID: \"cc905509-3ed8-4b63-a120-a8c5bc8fcdba\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250941 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dddz\" (UniqueName: \"kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz\") pod \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.250969 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities\") pod \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.251005 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content\") pod \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\" (UID: \"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.259069 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities" (OuterVolumeSpecName: "utilities") pod "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" (UID: "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.259738 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz" (OuterVolumeSpecName: "kube-api-access-6dddz") pod "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" (UID: "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5"). InnerVolumeSpecName "kube-api-access-6dddz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.260088 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "cc905509-3ed8-4b63-a120-a8c5bc8fcdba" (UID: "cc905509-3ed8-4b63-a120-a8c5bc8fcdba"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.263660 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6" (OuterVolumeSpecName: "kube-api-access-ngph6") pod "d543f7e0-d967-4e5a-8cae-19da02f5a7e8" (UID: "d543f7e0-d967-4e5a-8cae-19da02f5a7e8"). InnerVolumeSpecName "kube-api-access-ngph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.267166 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "cc905509-3ed8-4b63-a120-a8c5bc8fcdba" (UID: "cc905509-3ed8-4b63-a120-a8c5bc8fcdba"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.276722 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities" (OuterVolumeSpecName: "utilities") pod "d543f7e0-d967-4e5a-8cae-19da02f5a7e8" (UID: "d543f7e0-d967-4e5a-8cae-19da02f5a7e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.282130 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm" (OuterVolumeSpecName: "kube-api-access-b28nm") pod "cc905509-3ed8-4b63-a120-a8c5bc8fcdba" (UID: "cc905509-3ed8-4b63-a120-a8c5bc8fcdba"). InnerVolumeSpecName "kube-api-access-b28nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.283624 4856 generic.go:334] "Generic (PLEG): container finished" podID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerID="b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74" exitCode=0 Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.283710 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerDied","Data":"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.283748 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dqwl2" event={"ID":"7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5","Type":"ContainerDied","Data":"41db8f10710e46e0522d6d9c97340884ab21c1b405a37bf95a4090e6fc7bc91e"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.283769 4856 scope.go:117] "RemoveContainer" 
containerID="b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.283880 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dqwl2" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.290529 4856 generic.go:334] "Generic (PLEG): container finished" podID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" exitCode=0 Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.290571 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerDied","Data":"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.290590 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tgq7q" event={"ID":"9e31609a-8b57-4cae-a4a7-cfe4a24e346b","Type":"ContainerDied","Data":"c0797837505fd4dfc26f2ca3af266ba77614a217b7b1dbd7ba703cf6d2d626f9"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.290650 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tgq7q" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.299303 4856 generic.go:334] "Generic (PLEG): container finished" podID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerID="a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0" exitCode=0 Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.299382 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerDied","Data":"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.299409 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5w74v" event={"ID":"d543f7e0-d967-4e5a-8cae-19da02f5a7e8","Type":"ContainerDied","Data":"b62978e2a3aa477453a0bf551139c00b031ccfbd07b55ceb4eedc09f052de8df"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.299474 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5w74v" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.308000 4856 scope.go:117] "RemoveContainer" containerID="998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.312008 4856 generic.go:334] "Generic (PLEG): container finished" podID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerID="5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24" exitCode=0 Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.312047 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.312076 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerDied","Data":"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.312100 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9fh88" event={"ID":"cc905509-3ed8-4b63-a120-a8c5bc8fcdba","Type":"ContainerDied","Data":"3e616392d2def6157d1b434a1e69132620a3fb9d628c013babbfa8d1972ae1a4"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.313047 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d543f7e0-d967-4e5a-8cae-19da02f5a7e8" (UID: "d543f7e0-d967-4e5a-8cae-19da02f5a7e8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.315641 4856 generic.go:334] "Generic (PLEG): container finished" podID="b99e422b-ccde-422a-869f-7898a008a66a" containerID="0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc" exitCode=0 Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.315683 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerDied","Data":"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.315709 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-w59xx" event={"ID":"b99e422b-ccde-422a-869f-7898a008a66a","Type":"ContainerDied","Data":"3e284f68faf306d2b6e6172d5e87043bed5407c84abc123ddb02cb2a70849ec5"} Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.315762 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-w59xx" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.324759 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" (UID: "7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.336871 4856 scope.go:117] "RemoveContainer" containerID="eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.341766 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.344961 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9fh88"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352486 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmsmq\" (UniqueName: \"kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq\") pod \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352542 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk5fw\" (UniqueName: \"kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw\") pod \"b99e422b-ccde-422a-869f-7898a008a66a\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352495 4856 scope.go:117] "RemoveContainer" containerID="b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352579 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities\") pod \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352704 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content\") pod \"b99e422b-ccde-422a-869f-7898a008a66a\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352755 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content\") pod \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\" (UID: \"9e31609a-8b57-4cae-a4a7-cfe4a24e346b\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.352799 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities\") pod \"b99e422b-ccde-422a-869f-7898a008a66a\" (UID: \"b99e422b-ccde-422a-869f-7898a008a66a\") " Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.355108 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw" (OuterVolumeSpecName: "kube-api-access-lk5fw") pod "b99e422b-ccde-422a-869f-7898a008a66a" (UID: "b99e422b-ccde-422a-869f-7898a008a66a"). InnerVolumeSpecName "kube-api-access-lk5fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.355220 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74\": container with ID starting with b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74 not found: ID does not exist" containerID="b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.355251 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74"} err="failed to get container status \"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74\": rpc error: code = NotFound desc = could not find container \"b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74\": container with ID starting with b72e2c5d07b41d3325db096e08f8e299526ee5b8194aa337af6c37a6b8246b74 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.355290 4856 scope.go:117] "RemoveContainer" containerID="998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.355315 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq" (OuterVolumeSpecName: "kube-api-access-jmsmq") pod "9e31609a-8b57-4cae-a4a7-cfe4a24e346b" (UID: "9e31609a-8b57-4cae-a4a7-cfe4a24e346b"). InnerVolumeSpecName "kube-api-access-jmsmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.356212 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd\": container with ID starting with 998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd not found: ID does not exist" containerID="998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.357045 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd"} err="failed to get container status \"998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd\": rpc error: code = NotFound desc = could not find container \"998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd\": container with ID starting with 998ef3eab0c9aebc52d344bad353e5618b5601cb6b85f2365998eb2b134bf4bd not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.357105 4856 scope.go:117] "RemoveContainer" containerID="eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.355016 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities" (OuterVolumeSpecName: "utilities") pod "9e31609a-8b57-4cae-a4a7-cfe4a24e346b" (UID: "9e31609a-8b57-4cae-a4a7-cfe4a24e346b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.357523 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities" (OuterVolumeSpecName: "utilities") pod "b99e422b-ccde-422a-869f-7898a008a66a" (UID: "b99e422b-ccde-422a-869f-7898a008a66a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.358158 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a\": container with ID starting with eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a not found: ID does not exist" containerID="eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.358219 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a"} err="failed to get container status \"eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a\": rpc error: code = NotFound desc = could not find container \"eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a\": container with ID starting with eecca6a0bac1e5bad25c3124457ce7276e40cb3f69676af6f82d468886fb6a1a not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.358252 4856 scope.go:117] "RemoveContainer" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366642 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk5fw\" (UniqueName: \"kubernetes.io/projected/b99e422b-ccde-422a-869f-7898a008a66a-kube-api-access-lk5fw\") on node \"crc\" DevicePath 
\"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366673 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366682 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366694 4856 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366702 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366710 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366718 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngph6\" (UniqueName: \"kubernetes.io/projected/d543f7e0-d967-4e5a-8cae-19da02f5a7e8-kube-api-access-ngph6\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366726 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366735 4856 reconciler_common.go:293] "Volume detached 
for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366743 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmsmq\" (UniqueName: \"kubernetes.io/projected/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-kube-api-access-jmsmq\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366751 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b28nm\" (UniqueName: \"kubernetes.io/projected/cc905509-3ed8-4b63-a120-a8c5bc8fcdba-kube-api-access-b28nm\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366760 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dddz\" (UniqueName: \"kubernetes.io/projected/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-kube-api-access-6dddz\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.366767 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.377611 4856 scope.go:117] "RemoveContainer" containerID="25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.395028 4856 scope.go:117] "RemoveContainer" containerID="e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.410388 4856 scope.go:117] "RemoveContainer" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.410838 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594\": container with ID starting with 759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 not found: ID does not exist" containerID="759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.410867 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594"} err="failed to get container status \"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594\": rpc error: code = NotFound desc = could not find container \"759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594\": container with ID starting with 759b529b46211eb5966074fab4dc7ea68442971ddf71ccb945a85eecc447c594 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.410893 4856 scope.go:117] "RemoveContainer" containerID="25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.411249 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7\": container with ID starting with 25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7 not found: ID does not exist" containerID="25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.411347 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7"} err="failed to get container status \"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7\": rpc error: code = NotFound desc = could not find container \"25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7\": 
container with ID starting with 25b4594e649b3ddc75c2a6eded3b229a5064ba1534153da7bd1e87edeaea8fb7 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.411384 4856 scope.go:117] "RemoveContainer" containerID="e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.411810 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf\": container with ID starting with e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf not found: ID does not exist" containerID="e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.411853 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf"} err="failed to get container status \"e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf\": rpc error: code = NotFound desc = could not find container \"e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf\": container with ID starting with e29aaeeef87581f23b5d4f538e9e906c289c125b05d576f7905716c98c4b6cbf not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.411879 4856 scope.go:117] "RemoveContainer" containerID="a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.413893 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b99e422b-ccde-422a-869f-7898a008a66a" (UID: "b99e422b-ccde-422a-869f-7898a008a66a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.432196 4856 scope.go:117] "RemoveContainer" containerID="4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.445481 4856 scope.go:117] "RemoveContainer" containerID="14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.468257 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99e422b-ccde-422a-869f-7898a008a66a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.469790 4856 scope.go:117] "RemoveContainer" containerID="a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.470160 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0\": container with ID starting with a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0 not found: ID does not exist" containerID="a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.470193 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0"} err="failed to get container status \"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0\": rpc error: code = NotFound desc = could not find container \"a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0\": container with ID starting with a61a20762bad279adbbd2ce1a98a39c2ea0ebd34b35c69601b81f5f7787330a0 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.470219 4856 
scope.go:117] "RemoveContainer" containerID="4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.470608 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078\": container with ID starting with 4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078 not found: ID does not exist" containerID="4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.470628 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078"} err="failed to get container status \"4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078\": rpc error: code = NotFound desc = could not find container \"4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078\": container with ID starting with 4f0cd986567fc8c31d03180c5700d7aca5a35d557929802173c5e8c0f1dc2078 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.470640 4856 scope.go:117] "RemoveContainer" containerID="14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.471451 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b\": container with ID starting with 14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b not found: ID does not exist" containerID="14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.471481 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b"} err="failed to get container status \"14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b\": rpc error: code = NotFound desc = could not find container \"14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b\": container with ID starting with 14a53da003d943de057af0b6f761614303dbd815f56577e45ab0ee8469510b3b not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.471498 4856 scope.go:117] "RemoveContainer" containerID="5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.490451 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9e31609a-8b57-4cae-a4a7-cfe4a24e346b" (UID: "9e31609a-8b57-4cae-a4a7-cfe4a24e346b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.497894 4856 scope.go:117] "RemoveContainer" containerID="ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.516719 4856 scope.go:117] "RemoveContainer" containerID="5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.517167 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24\": container with ID starting with 5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24 not found: ID does not exist" containerID="5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.517244 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24"} err="failed to get container status \"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24\": rpc error: code = NotFound desc = could not find container \"5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24\": container with ID starting with 5ee6897d8e1d39aef34c042f4aa477807ef0970420777fd4ee4f907d66795f24 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.517285 4856 scope.go:117] "RemoveContainer" containerID="ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.517778 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c\": container with ID starting with 
ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c not found: ID does not exist" containerID="ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.517819 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c"} err="failed to get container status \"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c\": rpc error: code = NotFound desc = could not find container \"ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c\": container with ID starting with ae2023c233ea5f3edc79939e61321ced0940d8d2daaa9aa777209749b9fd584c not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.517848 4856 scope.go:117] "RemoveContainer" containerID="0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.536097 4856 scope.go:117] "RemoveContainer" containerID="a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.552555 4856 scope.go:117] "RemoveContainer" containerID="948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.569755 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e31609a-8b57-4cae-a4a7-cfe4a24e346b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.569892 4856 scope.go:117] "RemoveContainer" containerID="0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.571086 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc\": container with ID starting with 0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc not found: ID does not exist" containerID="0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.571122 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc"} err="failed to get container status \"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc\": rpc error: code = NotFound desc = could not find container \"0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc\": container with ID starting with 0b3f3117ff3cdbfd9d2e0cfc3bfb84fd92a9bb61a4328a9ab48d443f24a8d8bc not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.571147 4856 scope.go:117] "RemoveContainer" containerID="a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.571557 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804\": container with ID starting with a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804 not found: ID does not exist" containerID="a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.571585 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804"} err="failed to get container status \"a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804\": rpc error: code = NotFound desc = could not find container \"a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804\": container with ID 
starting with a8944f18aed4904bb2af759d86e0e1863c20618ddec50ba086ac530702ee1804 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.571600 4856 scope.go:117] "RemoveContainer" containerID="948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74" Mar 20 13:30:45 crc kubenswrapper[4856]: E0320 13:30:45.571847 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74\": container with ID starting with 948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74 not found: ID does not exist" containerID="948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.571870 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74"} err="failed to get container status \"948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74\": rpc error: code = NotFound desc = could not find container \"948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74\": container with ID starting with 948dee9054596394b3edd31e5e18420c3ff1ff902287696921b8b70d0610ea74 not found: ID does not exist" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.593810 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4z5q"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.634762 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.638338 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dqwl2"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.642909 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-tgq7q"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.646918 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tgq7q"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.652263 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.655875 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5w74v"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.668696 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-w59xx"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.672140 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-w59xx"] Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.827214 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" path="/var/lib/kubelet/pods/7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5/volumes" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.828044 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" path="/var/lib/kubelet/pods/9e31609a-8b57-4cae-a4a7-cfe4a24e346b/volumes" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.828813 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99e422b-ccde-422a-869f-7898a008a66a" path="/var/lib/kubelet/pods/b99e422b-ccde-422a-869f-7898a008a66a/volumes" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.830249 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" path="/var/lib/kubelet/pods/cc905509-3ed8-4b63-a120-a8c5bc8fcdba/volumes" Mar 20 13:30:45 crc kubenswrapper[4856]: I0320 13:30:45.830880 4856 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" path="/var/lib/kubelet/pods/d543f7e0-d967-4e5a-8cae-19da02f5a7e8/volumes" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.337258 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" event={"ID":"e3871fbb-6e58-45b2-a475-a45fa18a090d","Type":"ContainerStarted","Data":"3456d415c542febfcd709163e7440909ca90b45db7e802ff6f6c58883dfd192c"} Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.337617 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" event={"ID":"e3871fbb-6e58-45b2-a475-a45fa18a090d","Type":"ContainerStarted","Data":"101784c988853e8c8aeea3e3ed8e7d85485cfce495a8ef8e49a5926878332f85"} Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.337646 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.347849 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.359361 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n4z5q" podStartSLOduration=2.359334071 podStartE2EDuration="2.359334071s" podCreationTimestamp="2026-03-20 13:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:30:46.353341073 +0000 UTC m=+461.234367263" watchObservedRunningTime="2026-03-20 13:30:46.359334071 +0000 UTC m=+461.240360261" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.917789 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-hkp5x"] Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918074 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918094 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918112 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918124 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918140 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918152 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918172 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918184 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918204 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918216 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918236 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918248 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918298 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918310 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918358 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918375 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="extract-utilities" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918391 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918402 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918425 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918437 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918455 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918467 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918485 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918497 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918514 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918526 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: E0320 13:30:46.918545 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918559 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="extract-content" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918714 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e31609a-8b57-4cae-a4a7-cfe4a24e346b" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918730 4856 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="7588fc7f-c1ed-464b-8b3c-6cf0e47f43a5" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918748 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918769 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99e422b-ccde-422a-869f-7898a008a66a" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918782 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc905509-3ed8-4b63-a120-a8c5bc8fcdba" containerName="marketplace-operator" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.918802 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d543f7e0-d967-4e5a-8cae-19da02f5a7e8" containerName="registry-server" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.920576 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.923664 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.929641 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkp5x"] Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.987703 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfrk\" (UniqueName: \"kubernetes.io/projected/c2207a1a-f126-4cb6-841e-a7904a74a7d9-kube-api-access-9tfrk\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.987794 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-utilities\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:46 crc kubenswrapper[4856]: I0320 13:30:46.987870 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-catalog-content\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.089603 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-catalog-content\") pod \"redhat-marketplace-hkp5x\" (UID: 
\"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.089735 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfrk\" (UniqueName: \"kubernetes.io/projected/c2207a1a-f126-4cb6-841e-a7904a74a7d9-kube-api-access-9tfrk\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.089783 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-utilities\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.090264 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-catalog-content\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.090439 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c2207a1a-f126-4cb6-841e-a7904a74a7d9-utilities\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.106118 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.107380 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.111087 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.119043 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.119869 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfrk\" (UniqueName: \"kubernetes.io/projected/c2207a1a-f126-4cb6-841e-a7904a74a7d9-kube-api-access-9tfrk\") pod \"redhat-marketplace-hkp5x\" (UID: \"c2207a1a-f126-4cb6-841e-a7904a74a7d9\") " pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.190825 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.190883 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.190907 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58n4g\" (UniqueName: \"kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g\") pod \"certified-operators-x87hg\" (UID: 
\"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.246752 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.292374 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.292768 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.292807 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58n4g\" (UniqueName: \"kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.293201 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.293237 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.310942 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58n4g\" (UniqueName: \"kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g\") pod \"certified-operators-x87hg\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.477945 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.636367 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hkp5x"] Mar 20 13:30:47 crc kubenswrapper[4856]: W0320 13:30:47.641899 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2207a1a_f126_4cb6_841e_a7904a74a7d9.slice/crio-f24eacb7ebe6d28a58b51e8fca61146e9369433d02b8b217adc96654cbc4c6c3 WatchSource:0}: Error finding container f24eacb7ebe6d28a58b51e8fca61146e9369433d02b8b217adc96654cbc4c6c3: Status 404 returned error can't find the container with id f24eacb7ebe6d28a58b51e8fca61146e9369433d02b8b217adc96654cbc4c6c3 Mar 20 13:30:47 crc kubenswrapper[4856]: I0320 13:30:47.906977 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 13:30:47 crc kubenswrapper[4856]: W0320 13:30:47.916102 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd9f3dc_ba0f_46c3_936d_44ff5adb2357.slice/crio-c91a79769a729ced71109e017895bb7b6c28a86509f9cbe42a4c7386e57a0f4a WatchSource:0}: Error finding container c91a79769a729ced71109e017895bb7b6c28a86509f9cbe42a4c7386e57a0f4a: Status 404 returned error can't find the container with id c91a79769a729ced71109e017895bb7b6c28a86509f9cbe42a4c7386e57a0f4a Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.348370 4856 generic.go:334] "Generic (PLEG): container finished" podID="c2207a1a-f126-4cb6-841e-a7904a74a7d9" containerID="38436dd83232db6cfae3dc8880a7607823af916d69b93495e5525634eb3af575" exitCode=0 Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.348478 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkp5x" event={"ID":"c2207a1a-f126-4cb6-841e-a7904a74a7d9","Type":"ContainerDied","Data":"38436dd83232db6cfae3dc8880a7607823af916d69b93495e5525634eb3af575"} Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.348520 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkp5x" event={"ID":"c2207a1a-f126-4cb6-841e-a7904a74a7d9","Type":"ContainerStarted","Data":"f24eacb7ebe6d28a58b51e8fca61146e9369433d02b8b217adc96654cbc4c6c3"} Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.350289 4856 generic.go:334] "Generic (PLEG): container finished" podID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerID="95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9" exitCode=0 Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.350410 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerDied","Data":"95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9"} Mar 20 13:30:48 crc kubenswrapper[4856]: I0320 13:30:48.350459 4856 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerStarted","Data":"c91a79769a729ced71109e017895bb7b6c28a86509f9cbe42a4c7386e57a0f4a"} Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.319075 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dn229"] Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.320846 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.322745 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.323146 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn229"] Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.435879 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw6hb\" (UniqueName: \"kubernetes.io/projected/028ade1f-56fe-45a4-a7ce-6d3d62e38657-kube-api-access-nw6hb\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.435924 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-utilities\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.436094 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-catalog-content\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.517386 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z6827"] Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.518885 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.520567 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6827"] Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.523627 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.536959 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw6hb\" (UniqueName: \"kubernetes.io/projected/028ade1f-56fe-45a4-a7ce-6d3d62e38657-kube-api-access-nw6hb\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.537021 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-utilities\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.537079 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-catalog-content\") pod 
\"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.537596 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-catalog-content\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.538034 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/028ade1f-56fe-45a4-a7ce-6d3d62e38657-utilities\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.566907 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw6hb\" (UniqueName: \"kubernetes.io/projected/028ade1f-56fe-45a4-a7ce-6d3d62e38657-kube-api-access-nw6hb\") pod \"redhat-operators-dn229\" (UID: \"028ade1f-56fe-45a4-a7ce-6d3d62e38657\") " pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.638152 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-utilities\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.638332 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-catalog-content\") pod \"community-operators-z6827\" (UID: 
\"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.638534 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7gb\" (UniqueName: \"kubernetes.io/projected/d77a2053-0327-4a08-a50a-89945990633c-kube-api-access-nq7gb\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.695356 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.744088 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-utilities\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.744154 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-catalog-content\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.744206 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7gb\" (UniqueName: \"kubernetes.io/projected/d77a2053-0327-4a08-a50a-89945990633c-kube-api-access-nq7gb\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.744791 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-utilities\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.744955 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d77a2053-0327-4a08-a50a-89945990633c-catalog-content\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.763805 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7gb\" (UniqueName: \"kubernetes.io/projected/d77a2053-0327-4a08-a50a-89945990633c-kube-api-access-nq7gb\") pod \"community-operators-z6827\" (UID: \"d77a2053-0327-4a08-a50a-89945990633c\") " pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:49 crc kubenswrapper[4856]: I0320 13:30:49.857902 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.067572 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z6827"] Mar 20 13:30:50 crc kubenswrapper[4856]: W0320 13:30:50.076014 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd77a2053_0327_4a08_a50a_89945990633c.slice/crio-3295aeb4acf900edfc1b03f5fcb6395f079450647f81b8cc8ef74101e5cb4844 WatchSource:0}: Error finding container 3295aeb4acf900edfc1b03f5fcb6395f079450647f81b8cc8ef74101e5cb4844: Status 404 returned error can't find the container with id 3295aeb4acf900edfc1b03f5fcb6395f079450647f81b8cc8ef74101e5cb4844 Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.101940 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn229"] Mar 20 13:30:50 crc kubenswrapper[4856]: W0320 13:30:50.104846 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod028ade1f_56fe_45a4_a7ce_6d3d62e38657.slice/crio-ade19b221232e2f74db5da7980cb53c469f5d1c309bf3506e3c43889814a295c WatchSource:0}: Error finding container ade19b221232e2f74db5da7980cb53c469f5d1c309bf3506e3c43889814a295c: Status 404 returned error can't find the container with id ade19b221232e2f74db5da7980cb53c469f5d1c309bf3506e3c43889814a295c Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.370326 4856 generic.go:334] "Generic (PLEG): container finished" podID="c2207a1a-f126-4cb6-841e-a7904a74a7d9" containerID="f427e888fbdcbec64ed60edf02ef40ae1180a47c55cba467edc3ee5d8950f923" exitCode=0 Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.370397 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkp5x" 
event={"ID":"c2207a1a-f126-4cb6-841e-a7904a74a7d9","Type":"ContainerDied","Data":"f427e888fbdcbec64ed60edf02ef40ae1180a47c55cba467edc3ee5d8950f923"} Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.376539 4856 generic.go:334] "Generic (PLEG): container finished" podID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerID="e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa" exitCode=0 Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.376592 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerDied","Data":"e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa"} Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.381225 4856 generic.go:334] "Generic (PLEG): container finished" podID="d77a2053-0327-4a08-a50a-89945990633c" containerID="6656f710836ba9c3c49ec3749ab1a6cb4bd0f6b9b0238b8d89f9b22e67733698" exitCode=0 Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.381539 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6827" event={"ID":"d77a2053-0327-4a08-a50a-89945990633c","Type":"ContainerDied","Data":"6656f710836ba9c3c49ec3749ab1a6cb4bd0f6b9b0238b8d89f9b22e67733698"} Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.381599 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6827" event={"ID":"d77a2053-0327-4a08-a50a-89945990633c","Type":"ContainerStarted","Data":"3295aeb4acf900edfc1b03f5fcb6395f079450647f81b8cc8ef74101e5cb4844"} Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.385296 4856 generic.go:334] "Generic (PLEG): container finished" podID="028ade1f-56fe-45a4-a7ce-6d3d62e38657" containerID="ed1b6001b8987d2c7f7fe9705b6d35024da8e5286cc4cdbe0fb28db239c5e07a" exitCode=0 Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.385344 4856 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-dn229" event={"ID":"028ade1f-56fe-45a4-a7ce-6d3d62e38657","Type":"ContainerDied","Data":"ed1b6001b8987d2c7f7fe9705b6d35024da8e5286cc4cdbe0fb28db239c5e07a"} Mar 20 13:30:50 crc kubenswrapper[4856]: I0320 13:30:50.385381 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn229" event={"ID":"028ade1f-56fe-45a4-a7ce-6d3d62e38657","Type":"ContainerStarted","Data":"ade19b221232e2f74db5da7980cb53c469f5d1c309bf3506e3c43889814a295c"} Mar 20 13:30:51 crc kubenswrapper[4856]: I0320 13:30:51.395544 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hkp5x" event={"ID":"c2207a1a-f126-4cb6-841e-a7904a74a7d9","Type":"ContainerStarted","Data":"87813c3f3ff16e172550c874312d8e38d202b74ab13b434a50ccc3e148d8e2a3"} Mar 20 13:30:51 crc kubenswrapper[4856]: I0320 13:30:51.404071 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerStarted","Data":"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77"} Mar 20 13:30:51 crc kubenswrapper[4856]: I0320 13:30:51.418091 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hkp5x" podStartSLOduration=2.602410361 podStartE2EDuration="5.418075689s" podCreationTimestamp="2026-03-20 13:30:46 +0000 UTC" firstStartedPulling="2026-03-20 13:30:48.349689937 +0000 UTC m=+463.230716077" lastFinishedPulling="2026-03-20 13:30:51.165355275 +0000 UTC m=+466.046381405" observedRunningTime="2026-03-20 13:30:51.414183341 +0000 UTC m=+466.295209501" watchObservedRunningTime="2026-03-20 13:30:51.418075689 +0000 UTC m=+466.299101819" Mar 20 13:30:51 crc kubenswrapper[4856]: I0320 13:30:51.442673 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-x87hg" 
podStartSLOduration=1.9910556480000001 podStartE2EDuration="4.442655701s" podCreationTimestamp="2026-03-20 13:30:47 +0000 UTC" firstStartedPulling="2026-03-20 13:30:48.354503692 +0000 UTC m=+463.235529852" lastFinishedPulling="2026-03-20 13:30:50.806103765 +0000 UTC m=+465.687129905" observedRunningTime="2026-03-20 13:30:51.441502198 +0000 UTC m=+466.322528338" watchObservedRunningTime="2026-03-20 13:30:51.442655701 +0000 UTC m=+466.323681831" Mar 20 13:30:52 crc kubenswrapper[4856]: I0320 13:30:52.415624 4856 generic.go:334] "Generic (PLEG): container finished" podID="028ade1f-56fe-45a4-a7ce-6d3d62e38657" containerID="7e54a63822995d80dfb71cdd2b8ea96c3124c75228d02e21e38d17819884e9b1" exitCode=0 Mar 20 13:30:52 crc kubenswrapper[4856]: I0320 13:30:52.415987 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn229" event={"ID":"028ade1f-56fe-45a4-a7ce-6d3d62e38657","Type":"ContainerDied","Data":"7e54a63822995d80dfb71cdd2b8ea96c3124c75228d02e21e38d17819884e9b1"} Mar 20 13:30:53 crc kubenswrapper[4856]: I0320 13:30:53.424063 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn229" event={"ID":"028ade1f-56fe-45a4-a7ce-6d3d62e38657","Type":"ContainerStarted","Data":"cd6819704257f4d65f1972d1c268ee49e91e0bb46f94965d5a6988922a153f3e"} Mar 20 13:30:54 crc kubenswrapper[4856]: I0320 13:30:54.962987 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" podUID="479327d7-e582-4367-9f68-2f65ce5c3dfe" containerName="registry" containerID="cri-o://9c4c5b4a510f6f36ac10dd0d63c0647257078705d3165a5cc74f0a1a8a5c7f9d" gracePeriod=30 Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.452465 4856 generic.go:334] "Generic (PLEG): container finished" podID="479327d7-e582-4367-9f68-2f65ce5c3dfe" containerID="9c4c5b4a510f6f36ac10dd0d63c0647257078705d3165a5cc74f0a1a8a5c7f9d" exitCode=0 Mar 20 13:30:55 crc 
kubenswrapper[4856]: I0320 13:30:55.452575 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" event={"ID":"479327d7-e582-4367-9f68-2f65ce5c3dfe","Type":"ContainerDied","Data":"9c4c5b4a510f6f36ac10dd0d63c0647257078705d3165a5cc74f0a1a8a5c7f9d"} Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.452750 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" event={"ID":"479327d7-e582-4367-9f68-2f65ce5c3dfe","Type":"ContainerDied","Data":"17979ceed6e61f396d185701ba7d40bf177005729a4250e7d3b863df2a47284e"} Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.452767 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17979ceed6e61f396d185701ba7d40bf177005729a4250e7d3b863df2a47284e" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.461739 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.462252 4856 generic.go:334] "Generic (PLEG): container finished" podID="d77a2053-0327-4a08-a50a-89945990633c" containerID="7b78851a999a439106208551cfb439c25423d1663acaddfd2d76fe93b3402dcf" exitCode=0 Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.462308 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6827" event={"ID":"d77a2053-0327-4a08-a50a-89945990633c","Type":"ContainerDied","Data":"7b78851a999a439106208551cfb439c25423d1663acaddfd2d76fe93b3402dcf"} Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.486733 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dn229" podStartSLOduration=3.919558012 podStartE2EDuration="6.486712613s" podCreationTimestamp="2026-03-20 13:30:49 +0000 UTC" firstStartedPulling="2026-03-20 
13:30:50.386353045 +0000 UTC m=+465.267379175" lastFinishedPulling="2026-03-20 13:30:52.953507646 +0000 UTC m=+467.834533776" observedRunningTime="2026-03-20 13:30:53.453754309 +0000 UTC m=+468.334780459" watchObservedRunningTime="2026-03-20 13:30:55.486712613 +0000 UTC m=+470.367738743" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638635 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638681 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638702 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638725 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638768 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638825 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpwbd\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638843 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.638859 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token\") pod \"479327d7-e582-4367-9f68-2f65ce5c3dfe\" (UID: \"479327d7-e582-4367-9f68-2f65ce5c3dfe\") " Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.641469 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.641538 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.645593 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.646025 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.648469 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.656512 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.657446 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.657779 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd" (OuterVolumeSpecName: "kube-api-access-mpwbd") pod "479327d7-e582-4367-9f68-2f65ce5c3dfe" (UID: "479327d7-e582-4367-9f68-2f65ce5c3dfe"). InnerVolumeSpecName "kube-api-access-mpwbd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740479 4856 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740509 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740519 4856 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/479327d7-e582-4367-9f68-2f65ce5c3dfe-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740529 4856 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/479327d7-e582-4367-9f68-2f65ce5c3dfe-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740539 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpwbd\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-kube-api-access-mpwbd\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740549 4856 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/479327d7-e582-4367-9f68-2f65ce5c3dfe-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:55 crc kubenswrapper[4856]: I0320 13:30:55.740556 4856 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/479327d7-e582-4367-9f68-2f65ce5c3dfe-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:56 crc 
kubenswrapper[4856]: I0320 13:30:56.470173 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-84k9j" Mar 20 13:30:56 crc kubenswrapper[4856]: I0320 13:30:56.470179 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z6827" event={"ID":"d77a2053-0327-4a08-a50a-89945990633c","Type":"ContainerStarted","Data":"badd1b54c0dc5f33d9d285146e7ca0670a5e3bb59802f8496d2f6672a4cc76ea"} Mar 20 13:30:56 crc kubenswrapper[4856]: I0320 13:30:56.502528 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z6827" podStartSLOduration=1.989688487 podStartE2EDuration="7.502513961s" podCreationTimestamp="2026-03-20 13:30:49 +0000 UTC" firstStartedPulling="2026-03-20 13:30:50.383935827 +0000 UTC m=+465.264961957" lastFinishedPulling="2026-03-20 13:30:55.896761281 +0000 UTC m=+470.777787431" observedRunningTime="2026-03-20 13:30:56.499886807 +0000 UTC m=+471.380912937" watchObservedRunningTime="2026-03-20 13:30:56.502513961 +0000 UTC m=+471.383540091" Mar 20 13:30:56 crc kubenswrapper[4856]: I0320 13:30:56.516868 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"] Mar 20 13:30:56 crc kubenswrapper[4856]: I0320 13:30:56.523553 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-84k9j"] Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.247681 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.248334 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.315180 4856 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.478953 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.478995 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.523609 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hkp5x" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.539748 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:57 crc kubenswrapper[4856]: I0320 13:30:57.830437 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479327d7-e582-4367-9f68-2f65ce5c3dfe" path="/var/lib/kubelet/pods/479327d7-e582-4367-9f68-2f65ce5c3dfe/volumes" Mar 20 13:30:58 crc kubenswrapper[4856]: I0320 13:30:58.552804 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 13:30:59 crc kubenswrapper[4856]: I0320 13:30:59.695838 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:59 crc kubenswrapper[4856]: I0320 13:30:59.695921 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:30:59 crc kubenswrapper[4856]: I0320 13:30:59.859008 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:59 crc kubenswrapper[4856]: I0320 13:30:59.859078 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-z6827" Mar 20 13:30:59 crc kubenswrapper[4856]: I0320 13:30:59.922474 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z6827" Mar 20 13:31:00 crc kubenswrapper[4856]: I0320 13:31:00.759465 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dn229" podUID="028ade1f-56fe-45a4-a7ce-6d3d62e38657" containerName="registry-server" probeResult="failure" output=< Mar 20 13:31:00 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:31:00 crc kubenswrapper[4856]: > Mar 20 13:31:09 crc kubenswrapper[4856]: I0320 13:31:09.748859 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:31:09 crc kubenswrapper[4856]: I0320 13:31:09.838799 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dn229" Mar 20 13:31:09 crc kubenswrapper[4856]: I0320 13:31:09.915634 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z6827" Mar 20 13:31:09 crc kubenswrapper[4856]: I0320 13:31:09.987852 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:09 crc kubenswrapper[4856]: I0320 13:31:09.987912 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:31:39 crc 
kubenswrapper[4856]: I0320 13:31:39.988025 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:31:39 crc kubenswrapper[4856]: I0320 13:31:39.988584 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:31:39 crc kubenswrapper[4856]: I0320 13:31:39.988640 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:31:39 crc kubenswrapper[4856]: I0320 13:31:39.989243 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:31:39 crc kubenswrapper[4856]: I0320 13:31:39.989330 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379" gracePeriod=600 Mar 20 13:31:40 crc kubenswrapper[4856]: I0320 13:31:40.792217 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" 
containerID="73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379" exitCode=0 Mar 20 13:31:40 crc kubenswrapper[4856]: I0320 13:31:40.792343 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379"} Mar 20 13:31:40 crc kubenswrapper[4856]: I0320 13:31:40.792645 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be"} Mar 20 13:31:40 crc kubenswrapper[4856]: I0320 13:31:40.792679 4856 scope.go:117] "RemoveContainer" containerID="a1cc3d0a7221265b2421ca6e1b34bdb9eff73c75363bac7bd80b523935caab35" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.141767 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-5scpg"] Mar 20 13:32:00 crc kubenswrapper[4856]: E0320 13:32:00.142696 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479327d7-e582-4367-9f68-2f65ce5c3dfe" containerName="registry" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.142716 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="479327d7-e582-4367-9f68-2f65ce5c3dfe" containerName="registry" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.142866 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="479327d7-e582-4367-9f68-2f65ce5c3dfe" containerName="registry" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.169229 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-5scpg"] Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.169389 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.174142 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.174512 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.184190 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.328443 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mln4n\" (UniqueName: \"kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n\") pod \"auto-csr-approver-29566892-5scpg\" (UID: \"2fc9d40c-f2a7-4304-b13e-53f6e93446ae\") " pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.429529 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mln4n\" (UniqueName: \"kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n\") pod \"auto-csr-approver-29566892-5scpg\" (UID: \"2fc9d40c-f2a7-4304-b13e-53f6e93446ae\") " pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.462192 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mln4n\" (UniqueName: \"kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n\") pod \"auto-csr-approver-29566892-5scpg\" (UID: \"2fc9d40c-f2a7-4304-b13e-53f6e93446ae\") " pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.486731 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.785104 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.780791 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-5scpg"] Mar 20 13:32:00 crc kubenswrapper[4856]: I0320 13:32:00.942499 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-5scpg" event={"ID":"2fc9d40c-f2a7-4304-b13e-53f6e93446ae","Type":"ContainerStarted","Data":"6c9f9246b87e47634a11223e6582dc8d7fbaee70c4778d030a6c749ea04276a8"} Mar 20 13:32:02 crc kubenswrapper[4856]: I0320 13:32:02.957713 4856 generic.go:334] "Generic (PLEG): container finished" podID="2fc9d40c-f2a7-4304-b13e-53f6e93446ae" containerID="ef5cba9659c688772e38e670b24db43c7564df4637cba6e8a73545d54b51819b" exitCode=0 Mar 20 13:32:02 crc kubenswrapper[4856]: I0320 13:32:02.957833 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-5scpg" event={"ID":"2fc9d40c-f2a7-4304-b13e-53f6e93446ae","Type":"ContainerDied","Data":"ef5cba9659c688772e38e670b24db43c7564df4637cba6e8a73545d54b51819b"} Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.215948 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.384817 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mln4n\" (UniqueName: \"kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n\") pod \"2fc9d40c-f2a7-4304-b13e-53f6e93446ae\" (UID: \"2fc9d40c-f2a7-4304-b13e-53f6e93446ae\") " Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.393121 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n" (OuterVolumeSpecName: "kube-api-access-mln4n") pod "2fc9d40c-f2a7-4304-b13e-53f6e93446ae" (UID: "2fc9d40c-f2a7-4304-b13e-53f6e93446ae"). InnerVolumeSpecName "kube-api-access-mln4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.486748 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mln4n\" (UniqueName: \"kubernetes.io/projected/2fc9d40c-f2a7-4304-b13e-53f6e93446ae-kube-api-access-mln4n\") on node \"crc\" DevicePath \"\"" Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.972705 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-5scpg" event={"ID":"2fc9d40c-f2a7-4304-b13e-53f6e93446ae","Type":"ContainerDied","Data":"6c9f9246b87e47634a11223e6582dc8d7fbaee70c4778d030a6c749ea04276a8"} Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.972743 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c9f9246b87e47634a11223e6582dc8d7fbaee70c4778d030a6c749ea04276a8" Mar 20 13:32:04 crc kubenswrapper[4856]: I0320 13:32:04.972804 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-5scpg" Mar 20 13:32:05 crc kubenswrapper[4856]: I0320 13:32:05.282262 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-xkbwc"] Mar 20 13:32:05 crc kubenswrapper[4856]: I0320 13:32:05.286486 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-xkbwc"] Mar 20 13:32:05 crc kubenswrapper[4856]: I0320 13:32:05.833616 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46f98403-30f8-40f6-afa6-6defe5937024" path="/var/lib/kubelet/pods/46f98403-30f8-40f6-afa6-6defe5937024/volumes" Mar 20 13:33:20 crc kubenswrapper[4856]: I0320 13:33:20.822835 4856 scope.go:117] "RemoveContainer" containerID="9c4c5b4a510f6f36ac10dd0d63c0647257078705d3165a5cc74f0a1a8a5c7f9d" Mar 20 13:33:20 crc kubenswrapper[4856]: I0320 13:33:20.849026 4856 scope.go:117] "RemoveContainer" containerID="1446ec6efacaafe17866ebdba53fd523321a51c1eeaad0db53ee215eb620b072" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.152105 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-28tg6"] Mar 20 13:34:00 crc kubenswrapper[4856]: E0320 13:34:00.153599 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fc9d40c-f2a7-4304-b13e-53f6e93446ae" containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.153629 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fc9d40c-f2a7-4304-b13e-53f6e93446ae" containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.154111 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fc9d40c-f2a7-4304-b13e-53f6e93446ae" containerName="oc" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.155238 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.163331 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.164036 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.165936 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.168697 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-28tg6"] Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.237947 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5tb6\" (UniqueName: \"kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6\") pod \"auto-csr-approver-29566894-28tg6\" (UID: \"6bd55705-a77e-4bc9-941b-eaf18f2fc458\") " pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.340170 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5tb6\" (UniqueName: \"kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6\") pod \"auto-csr-approver-29566894-28tg6\" (UID: \"6bd55705-a77e-4bc9-941b-eaf18f2fc458\") " pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.370880 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5tb6\" (UniqueName: \"kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6\") pod \"auto-csr-approver-29566894-28tg6\" (UID: \"6bd55705-a77e-4bc9-941b-eaf18f2fc458\") " 
pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.490790 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.694660 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-28tg6"] Mar 20 13:34:00 crc kubenswrapper[4856]: I0320 13:34:00.814925 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-28tg6" event={"ID":"6bd55705-a77e-4bc9-941b-eaf18f2fc458","Type":"ContainerStarted","Data":"e26ecd0fd13b03a90bd414e0ac076af74b2d512d78c0a8fcca9ef774b0a24df8"} Mar 20 13:34:03 crc kubenswrapper[4856]: I0320 13:34:03.836681 4856 generic.go:334] "Generic (PLEG): container finished" podID="6bd55705-a77e-4bc9-941b-eaf18f2fc458" containerID="7c8070ad6b67ff7d3e223a0ed884b7654c48831c61a1d12083ab32c3121e7601" exitCode=0 Mar 20 13:34:03 crc kubenswrapper[4856]: I0320 13:34:03.836757 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-28tg6" event={"ID":"6bd55705-a77e-4bc9-941b-eaf18f2fc458","Type":"ContainerDied","Data":"7c8070ad6b67ff7d3e223a0ed884b7654c48831c61a1d12083ab32c3121e7601"} Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.155170 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.208416 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5tb6\" (UniqueName: \"kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6\") pod \"6bd55705-a77e-4bc9-941b-eaf18f2fc458\" (UID: \"6bd55705-a77e-4bc9-941b-eaf18f2fc458\") " Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.217829 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6" (OuterVolumeSpecName: "kube-api-access-j5tb6") pod "6bd55705-a77e-4bc9-941b-eaf18f2fc458" (UID: "6bd55705-a77e-4bc9-941b-eaf18f2fc458"). InnerVolumeSpecName "kube-api-access-j5tb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.310426 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5tb6\" (UniqueName: \"kubernetes.io/projected/6bd55705-a77e-4bc9-941b-eaf18f2fc458-kube-api-access-j5tb6\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.852683 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-28tg6" event={"ID":"6bd55705-a77e-4bc9-941b-eaf18f2fc458","Type":"ContainerDied","Data":"e26ecd0fd13b03a90bd414e0ac076af74b2d512d78c0a8fcca9ef774b0a24df8"} Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.852718 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e26ecd0fd13b03a90bd414e0ac076af74b2d512d78c0a8fcca9ef774b0a24df8" Mar 20 13:34:05 crc kubenswrapper[4856]: I0320 13:34:05.852791 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-28tg6" Mar 20 13:34:06 crc kubenswrapper[4856]: I0320 13:34:06.243389 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-wnkb9"] Mar 20 13:34:06 crc kubenswrapper[4856]: I0320 13:34:06.249592 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-wnkb9"] Mar 20 13:34:07 crc kubenswrapper[4856]: I0320 13:34:07.832044 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f2e80da-b046-416e-9a95-1ebd9beba283" path="/var/lib/kubelet/pods/6f2e80da-b046-416e-9a95-1ebd9beba283/volumes" Mar 20 13:34:09 crc kubenswrapper[4856]: I0320 13:34:09.987896 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:34:09 crc kubenswrapper[4856]: I0320 13:34:09.987953 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:34:20 crc kubenswrapper[4856]: I0320 13:34:20.902837 4856 scope.go:117] "RemoveContainer" containerID="5bfb811dcbf7df7ee894bb7676ce502633e87593535c51987807044b32e9688c" Mar 20 13:34:20 crc kubenswrapper[4856]: I0320 13:34:20.958570 4856 scope.go:117] "RemoveContainer" containerID="4274eb3c15290a12fffa04cfbf82a0accb0141de4d6cd3c5e0d9a5450fd45cc5" Mar 20 13:34:21 crc kubenswrapper[4856]: I0320 13:34:21.010530 4856 scope.go:117] "RemoveContainer" containerID="6776f60792a1a4da0b8106c89896bf30d8c061604901f4f40a83db5e00cb9542" Mar 20 13:34:21 crc 
kubenswrapper[4856]: I0320 13:34:21.034040 4856 scope.go:117] "RemoveContainer" containerID="7d78c6a727eedf9dcf3c7473bba42603326d49827057f7d9fb67c5b5664e0329" Mar 20 13:34:39 crc kubenswrapper[4856]: I0320 13:34:39.987905 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:34:39 crc kubenswrapper[4856]: I0320 13:34:39.988635 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:09 crc kubenswrapper[4856]: I0320 13:35:09.988675 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:09 crc kubenswrapper[4856]: I0320 13:35:09.989572 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:09 crc kubenswrapper[4856]: I0320 13:35:09.989648 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:35:09 crc kubenswrapper[4856]: I0320 13:35:09.991259 4856 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:35:09 crc kubenswrapper[4856]: I0320 13:35:09.991411 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be" gracePeriod=600 Mar 20 13:35:10 crc kubenswrapper[4856]: I0320 13:35:10.293986 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be" exitCode=0 Mar 20 13:35:10 crc kubenswrapper[4856]: I0320 13:35:10.294075 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be"} Mar 20 13:35:10 crc kubenswrapper[4856]: I0320 13:35:10.294449 4856 scope.go:117] "RemoveContainer" containerID="73d5e3cd3a0fc09f2b610500ea5f1a9ec2b4937905af47d367a3f527b846e379" Mar 20 13:35:11 crc kubenswrapper[4856]: I0320 13:35:11.305746 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3"} Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.139357 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-9gch9"] Mar 20 13:36:00 crc 
kubenswrapper[4856]: E0320 13:36:00.140201 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bd55705-a77e-4bc9-941b-eaf18f2fc458" containerName="oc"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.140218 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bd55705-a77e-4bc9-941b-eaf18f2fc458" containerName="oc"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.140403 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bd55705-a77e-4bc9-941b-eaf18f2fc458" containerName="oc"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.140857 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.143334 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.143563 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.143437 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.152119 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-9gch9"]
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.196492 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdjjw\" (UniqueName: \"kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw\") pod \"auto-csr-approver-29566896-9gch9\" (UID: \"eb91f950-617b-4179-b3b0-b03c74f45ce4\") " pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.297607 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdjjw\" (UniqueName: \"kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw\") pod \"auto-csr-approver-29566896-9gch9\" (UID: \"eb91f950-617b-4179-b3b0-b03c74f45ce4\") " pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.319966 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdjjw\" (UniqueName: \"kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw\") pod \"auto-csr-approver-29566896-9gch9\" (UID: \"eb91f950-617b-4179-b3b0-b03c74f45ce4\") " pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.461351 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:00 crc kubenswrapper[4856]: I0320 13:36:00.797217 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-9gch9"]
Mar 20 13:36:00 crc kubenswrapper[4856]: W0320 13:36:00.801376 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb91f950_617b_4179_b3b0_b03c74f45ce4.slice/crio-f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2 WatchSource:0}: Error finding container f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2: Status 404 returned error can't find the container with id f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2
Mar 20 13:36:01 crc kubenswrapper[4856]: I0320 13:36:01.642657 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-9gch9" event={"ID":"eb91f950-617b-4179-b3b0-b03c74f45ce4","Type":"ContainerStarted","Data":"f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2"}
Mar 20 13:36:02 crc kubenswrapper[4856]: I0320 13:36:02.654262 4856 generic.go:334] "Generic (PLEG): container finished" podID="eb91f950-617b-4179-b3b0-b03c74f45ce4" containerID="2e4c11104392346203a0b7b47f90c927685ddee9037e983ba4a0f25e7cadd256" exitCode=0
Mar 20 13:36:02 crc kubenswrapper[4856]: I0320 13:36:02.654435 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-9gch9" event={"ID":"eb91f950-617b-4179-b3b0-b03c74f45ce4","Type":"ContainerDied","Data":"2e4c11104392346203a0b7b47f90c927685ddee9037e983ba4a0f25e7cadd256"}
Mar 20 13:36:03 crc kubenswrapper[4856]: I0320 13:36:03.983132 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.147019 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdjjw\" (UniqueName: \"kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw\") pod \"eb91f950-617b-4179-b3b0-b03c74f45ce4\" (UID: \"eb91f950-617b-4179-b3b0-b03c74f45ce4\") "
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.156565 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw" (OuterVolumeSpecName: "kube-api-access-mdjjw") pod "eb91f950-617b-4179-b3b0-b03c74f45ce4" (UID: "eb91f950-617b-4179-b3b0-b03c74f45ce4"). InnerVolumeSpecName "kube-api-access-mdjjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.249214 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdjjw\" (UniqueName: \"kubernetes.io/projected/eb91f950-617b-4179-b3b0-b03c74f45ce4-kube-api-access-mdjjw\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.668056 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-9gch9" event={"ID":"eb91f950-617b-4179-b3b0-b03c74f45ce4","Type":"ContainerDied","Data":"f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2"}
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.668099 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f31e9e0d4876c23747b8688c01dd15a128db6bd736594b9edbdb6b71673b95b2"
Mar 20 13:36:04 crc kubenswrapper[4856]: I0320 13:36:04.668144 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-9gch9"
Mar 20 13:36:05 crc kubenswrapper[4856]: I0320 13:36:05.057438 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-8ckqc"]
Mar 20 13:36:05 crc kubenswrapper[4856]: I0320 13:36:05.061189 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-8ckqc"]
Mar 20 13:36:05 crc kubenswrapper[4856]: I0320 13:36:05.836099 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3967f519-5176-4af8-8ada-a4b149713f76" path="/var/lib/kubelet/pods/3967f519-5176-4af8-8ada-a4b149713f76/volumes"
Mar 20 13:36:21 crc kubenswrapper[4856]: I0320 13:36:21.113646 4856 scope.go:117] "RemoveContainer" containerID="c44f1035a3891f07596fc5566cd27ca7aac02c6d2f74ad7ba600b7649026b385"
Mar 20 13:36:48 crc kubenswrapper[4856]: I0320 13:36:48.866589 4856 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:37:39 crc kubenswrapper[4856]: I0320 13:37:39.988133 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:37:39 crc kubenswrapper[4856]: I0320 13:37:39.988923 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.313824 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9njpz"]
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314551 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-controller" containerID="cri-o://f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314873 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="sbdb" containerID="cri-o://291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314910 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="nbdb" containerID="cri-o://058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314938 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="northd" containerID="cri-o://f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314963 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.314994 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-node" containerID="cri-o://0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.315022 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-acl-logging" containerID="cri-o://3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.386987 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller" containerID="cri-o://fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" gracePeriod=30
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.650997 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/3.log"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.653780 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovn-acl-logging/0.log"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.654237 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovn-controller/0.log"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.654655 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708521 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-85td9"]
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708775 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="nbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708795 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="nbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708808 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kubecfg-setup"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708817 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kubecfg-setup"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708826 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb91f950-617b-4179-b3b0-b03c74f45ce4" containerName="oc"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708833 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb91f950-617b-4179-b3b0-b03c74f45ce4" containerName="oc"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708842 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-acl-logging"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708849 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-acl-logging"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708857 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-node"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708865 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-node"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708878 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="sbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708886 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="sbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708898 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708906 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708914 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708922 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708931 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708939 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708948 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708956 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708967 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708974 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.708986 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.708995 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.709005 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="northd"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709013 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="northd"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709092 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="sbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709102 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-node"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709112 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="northd"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709121 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709129 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovn-acl-logging"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709138 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709144 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="nbdb"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709151 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709158 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709167 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709174 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb91f950-617b-4179-b3b0-b03c74f45ce4" containerName="oc"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709180 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: E0320 13:37:42.709309 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709317 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.709394 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerName="ovnkube-controller"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.710970 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746790 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746832 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746862 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746882 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746902 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746941 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746975 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.746989 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747008 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747026 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747048 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747068 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747103 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747127 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747151 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747168 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747233 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747255 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzmgq\" (UniqueName: \"kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747290 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch\") pod \"24a5ae28-8378-4545-af2d-cf1eb86364a2\" (UID: \"24a5ae28-8378-4545-af2d-cf1eb86364a2\") "
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747560 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747600 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash" (OuterVolumeSpecName: "host-slash") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747617 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.747969 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.748001 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.748498 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.748538 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749294 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749294 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749317 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket" (OuterVolumeSpecName: "log-socket") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749339 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749349 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log" (OuterVolumeSpecName: "node-log") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749367 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749448 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749522 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749643 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.749792 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.754439 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.757861 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq" (OuterVolumeSpecName: "kube-api-access-qzmgq") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "kube-api-access-qzmgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.763735 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "24a5ae28-8378-4545-af2d-cf1eb86364a2" (UID: "24a5ae28-8378-4545-af2d-cf1eb86364a2"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848778 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-systemd-units\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848821 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-env-overrides\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848840 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-script-lib\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848859 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-systemd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848873 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-log-socket\") pod \"ovnkube-node-85td9\" (UID:
\"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848885 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-bin\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848902 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-netd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848921 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-node-log\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848936 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-slash\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848950 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfgh\" (UniqueName: \"kubernetes.io/projected/89b85bac-6283-44ba-8b01-aa707ae861c1-kube-api-access-5vfgh\") pod \"ovnkube-node-85td9\" 
(UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848964 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-netns\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.848983 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-kubelet\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849002 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-ovn\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849033 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-config\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849049 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-openvswitch\") pod \"ovnkube-node-85td9\" 
(UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849066 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-etc-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849112 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849130 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-var-lib-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849145 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b85bac-6283-44ba-8b01-aa707ae861c1-ovn-node-metrics-cert\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849163 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849198 4856 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849209 4856 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849218 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzmgq\" (UniqueName: \"kubernetes.io/projected/24a5ae28-8378-4545-af2d-cf1eb86364a2-kube-api-access-qzmgq\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849227 4856 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849234 4856 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849243 4856 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849251 4856 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849258 4856 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849266 4856 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849291 4856 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849298 4856 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849306 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/24a5ae28-8378-4545-af2d-cf1eb86364a2-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849315 4856 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849324 4856 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849332 4856 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849340 4856 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849348 4856 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849356 4856 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849364 4856 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.849373 4856 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/24a5ae28-8378-4545-af2d-cf1eb86364a2-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950377 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-systemd-units\") pod 
\"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950462 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-env-overrides\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950514 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-script-lib\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950568 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-systemd-units\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950600 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-log-socket\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950650 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-systemd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950698 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-bin\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950746 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-netd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950797 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-node-log\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950842 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-slash\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950888 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfgh\" (UniqueName: \"kubernetes.io/projected/89b85bac-6283-44ba-8b01-aa707ae861c1-kube-api-access-5vfgh\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: 
I0320 13:37:42.950937 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-netns\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.950985 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-kubelet\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951034 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-ovn\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951221 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-config\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951250 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-node-log\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951304 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-log-socket\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951308 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951332 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-systemd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951357 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-bin\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951381 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-cni-netd\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951393 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-etc-openvswitch\") pod 
\"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951409 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-ovn\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951435 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-slash\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951549 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-run-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951565 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951623 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-var-lib-openvswitch\") pod \"ovnkube-node-85td9\" (UID: 
\"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951668 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b85bac-6283-44ba-8b01-aa707ae861c1-ovn-node-metrics-cert\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951705 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-netns\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951714 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951731 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-kubelet\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951852 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-run-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951921 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951972 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-script-lib\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951226 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-env-overrides\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.951984 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-var-lib-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.952050 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b85bac-6283-44ba-8b01-aa707ae861c1-etc-openvswitch\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" 
Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.952117 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b85bac-6283-44ba-8b01-aa707ae861c1-ovnkube-config\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.955959 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b85bac-6283-44ba-8b01-aa707ae861c1-ovn-node-metrics-cert\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:42 crc kubenswrapper[4856]: I0320 13:37:42.973578 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfgh\" (UniqueName: \"kubernetes.io/projected/89b85bac-6283-44ba-8b01-aa707ae861c1-kube-api-access-5vfgh\") pod \"ovnkube-node-85td9\" (UID: \"89b85bac-6283-44ba-8b01-aa707ae861c1\") " pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.028795 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.345918 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/2.log" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.347090 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/1.log" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.347147 4856 generic.go:334] "Generic (PLEG): container finished" podID="da4c21dd-2600-4141-bf05-7c18c1932a33" containerID="674d23b0d62a7fc9ce60c35c93919539056a2169216e683cc45f7588b8727351" exitCode=2 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.347223 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerDied","Data":"674d23b0d62a7fc9ce60c35c93919539056a2169216e683cc45f7588b8727351"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.347302 4856 scope.go:117] "RemoveContainer" containerID="2d5c538d2062207372f62cd98f74f337d2c14c2e6e7695e96c4d5bb3693ec608" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.348083 4856 scope.go:117] "RemoveContainer" containerID="674d23b0d62a7fc9ce60c35c93919539056a2169216e683cc45f7588b8727351" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.351660 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovnkube-controller/3.log" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.354123 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovn-acl-logging/0.log" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.354722 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9njpz_24a5ae28-8378-4545-af2d-cf1eb86364a2/ovn-controller/0.log" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355705 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355726 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355733 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355739 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355747 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355754 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355761 4856 generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" exitCode=143 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355767 4856 
generic.go:334] "Generic (PLEG): container finished" podID="24a5ae28-8378-4545-af2d-cf1eb86364a2" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" exitCode=143 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355808 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355878 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355894 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355904 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355912 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355921 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" 
event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355931 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355940 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355946 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355952 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355958 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355963 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355968 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355973 4856 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355978 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355984 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355990 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.355997 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356003 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356009 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356014 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:37:43 crc kubenswrapper[4856]: 
I0320 13:37:43.356019 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356024 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356029 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356034 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356039 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356043 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356050 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356057 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356064 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356069 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356075 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356081 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356088 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356096 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356102 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356107 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356113 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356120 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" event={"ID":"24a5ae28-8378-4545-af2d-cf1eb86364a2","Type":"ContainerDied","Data":"eca97356b8fc80daef2b879effb2722a2305fc17f0b77afa02f334ce7f97f82d"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356127 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356133 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356139 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356143 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356148 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356153 4856 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356158 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356163 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356168 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356172 4856 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.356249 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9njpz" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.362086 4856 generic.go:334] "Generic (PLEG): container finished" podID="89b85bac-6283-44ba-8b01-aa707ae861c1" containerID="e492b3c6d4514860cb63d08de00b261a372dc784d56c8489675fcbc047ab2ab5" exitCode=0 Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.362201 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerDied","Data":"e492b3c6d4514860cb63d08de00b261a372dc784d56c8489675fcbc047ab2ab5"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.362252 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"cc1056c6a5d29f5f70e6a61b1b5fc3df3e50d7c90c12e73f9d411b8a8618e98c"} Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.404411 4856 scope.go:117] "RemoveContainer" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.428482 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9njpz"] Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.433561 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9njpz"] Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.446053 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.526102 4856 scope.go:117] "RemoveContainer" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.548415 4856 scope.go:117] "RemoveContainer" 
containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.571245 4856 scope.go:117] "RemoveContainer" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.592592 4856 scope.go:117] "RemoveContainer" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.615042 4856 scope.go:117] "RemoveContainer" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.667992 4856 scope.go:117] "RemoveContainer" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.689805 4856 scope.go:117] "RemoveContainer" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.715085 4856 scope.go:117] "RemoveContainer" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.733738 4856 scope.go:117] "RemoveContainer" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.735794 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": container with ID starting with fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327 not found: ID does not exist" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.735861 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} err="failed to get container status \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": rpc error: code = NotFound desc = could not find container \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": container with ID starting with fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.735908 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.736444 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": container with ID starting with bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78 not found: ID does not exist" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.736495 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} err="failed to get container status \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": rpc error: code = NotFound desc = could not find container \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": container with ID starting with bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.736525 4856 scope.go:117] "RemoveContainer" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.737442 4856 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": container with ID starting with 291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224 not found: ID does not exist" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.737487 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} err="failed to get container status \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": rpc error: code = NotFound desc = could not find container \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": container with ID starting with 291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.737509 4856 scope.go:117] "RemoveContainer" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.738871 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": container with ID starting with 058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5 not found: ID does not exist" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.738907 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} err="failed to get container status \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": rpc error: code = NotFound desc = could not find container 
\"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": container with ID starting with 058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.738964 4856 scope.go:117] "RemoveContainer" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.740376 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": container with ID starting with f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f not found: ID does not exist" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.740436 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} err="failed to get container status \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": rpc error: code = NotFound desc = could not find container \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": container with ID starting with f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.740468 4856 scope.go:117] "RemoveContainer" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.742408 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": container with ID starting with 3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6 not found: ID does not exist" 
containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.742430 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} err="failed to get container status \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": rpc error: code = NotFound desc = could not find container \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": container with ID starting with 3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.742446 4856 scope.go:117] "RemoveContainer" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.742712 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": container with ID starting with 0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd not found: ID does not exist" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.742737 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} err="failed to get container status \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": rpc error: code = NotFound desc = could not find container \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": container with ID starting with 0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.742754 4856 scope.go:117] 
"RemoveContainer" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.743111 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": container with ID starting with 3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352 not found: ID does not exist" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743136 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} err="failed to get container status \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": rpc error: code = NotFound desc = could not find container \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": container with ID starting with 3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743153 4856 scope.go:117] "RemoveContainer" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.743422 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": container with ID starting with f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635 not found: ID does not exist" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743443 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} err="failed to get container status \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": rpc error: code = NotFound desc = could not find container \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": container with ID starting with f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743461 4856 scope.go:117] "RemoveContainer" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: E0320 13:37:43.743723 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": container with ID starting with 918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f not found: ID does not exist" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743745 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} err="failed to get container status \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": rpc error: code = NotFound desc = could not find container \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": container with ID starting with 918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.743765 4856 scope.go:117] "RemoveContainer" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.744609 4856 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} err="failed to get container status \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": rpc error: code = NotFound desc = could not find container \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": container with ID starting with fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.744633 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.746493 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} err="failed to get container status \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": rpc error: code = NotFound desc = could not find container \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": container with ID starting with bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.746516 4856 scope.go:117] "RemoveContainer" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.749752 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} err="failed to get container status \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": rpc error: code = NotFound desc = could not find container \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": container with ID starting with 291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224 not 
found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.749786 4856 scope.go:117] "RemoveContainer" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750467 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} err="failed to get container status \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": rpc error: code = NotFound desc = could not find container \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": container with ID starting with 058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750487 4856 scope.go:117] "RemoveContainer" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750705 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} err="failed to get container status \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": rpc error: code = NotFound desc = could not find container \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": container with ID starting with f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750724 4856 scope.go:117] "RemoveContainer" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750961 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} err="failed to get 
container status \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": rpc error: code = NotFound desc = could not find container \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": container with ID starting with 3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.750982 4856 scope.go:117] "RemoveContainer" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.751285 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} err="failed to get container status \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": rpc error: code = NotFound desc = could not find container \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": container with ID starting with 0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.751309 4856 scope.go:117] "RemoveContainer" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.751740 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} err="failed to get container status \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": rpc error: code = NotFound desc = could not find container \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": container with ID starting with 3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.751764 4856 scope.go:117] "RemoveContainer" 
containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752011 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} err="failed to get container status \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": rpc error: code = NotFound desc = could not find container \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": container with ID starting with f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752030 4856 scope.go:117] "RemoveContainer" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752293 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} err="failed to get container status \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": rpc error: code = NotFound desc = could not find container \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": container with ID starting with 918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752313 4856 scope.go:117] "RemoveContainer" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752515 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} err="failed to get container status \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": rpc error: code = NotFound desc = could 
not find container \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": container with ID starting with fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752533 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752951 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} err="failed to get container status \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": rpc error: code = NotFound desc = could not find container \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": container with ID starting with bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.752970 4856 scope.go:117] "RemoveContainer" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.753204 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} err="failed to get container status \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": rpc error: code = NotFound desc = could not find container \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": container with ID starting with 291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.753222 4856 scope.go:117] "RemoveContainer" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 
13:37:43.753547 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} err="failed to get container status \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": rpc error: code = NotFound desc = could not find container \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": container with ID starting with 058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.753565 4856 scope.go:117] "RemoveContainer" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.753792 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} err="failed to get container status \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": rpc error: code = NotFound desc = could not find container \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": container with ID starting with f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.753810 4856 scope.go:117] "RemoveContainer" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.754058 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} err="failed to get container status \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": rpc error: code = NotFound desc = could not find container \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": container with ID starting with 
3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.754076 4856 scope.go:117] "RemoveContainer" containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.754330 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} err="failed to get container status \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": rpc error: code = NotFound desc = could not find container \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": container with ID starting with 0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.754365 4856 scope.go:117] "RemoveContainer" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.755260 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} err="failed to get container status \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": rpc error: code = NotFound desc = could not find container \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": container with ID starting with 3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.755305 4856 scope.go:117] "RemoveContainer" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.755553 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} err="failed to get container status \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": rpc error: code = NotFound desc = could not find container \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": container with ID starting with f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.755620 4856 scope.go:117] "RemoveContainer" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.755988 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} err="failed to get container status \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": rpc error: code = NotFound desc = could not find container \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": container with ID starting with 918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756007 4856 scope.go:117] "RemoveContainer" containerID="fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756375 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327"} err="failed to get container status \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": rpc error: code = NotFound desc = could not find container \"fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327\": container with ID starting with fa651a250bd91fce74df547463e85020619521a25c6c9c377e4bc03b71455327 not found: ID does not 
exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756398 4856 scope.go:117] "RemoveContainer" containerID="bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756726 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78"} err="failed to get container status \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": rpc error: code = NotFound desc = could not find container \"bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78\": container with ID starting with bd0ac408bae66c719f1e7d71f3a2d10c33d238a501e779115ca369c703afed78 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756746 4856 scope.go:117] "RemoveContainer" containerID="291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756980 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224"} err="failed to get container status \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": rpc error: code = NotFound desc = could not find container \"291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224\": container with ID starting with 291f0ae9a285825f3b08f76e48ea3476c4e6e12074da84a4c16d26045a6c4224 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.756999 4856 scope.go:117] "RemoveContainer" containerID="058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757242 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5"} err="failed to get container status 
\"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": rpc error: code = NotFound desc = could not find container \"058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5\": container with ID starting with 058f08b8a9652b70a0fc0bb8ee25669b27f4a6d455844458d6957730569fe0d5 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757262 4856 scope.go:117] "RemoveContainer" containerID="f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757537 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f"} err="failed to get container status \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": rpc error: code = NotFound desc = could not find container \"f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f\": container with ID starting with f6f1cdc671bb4f93d38ba59845ef8083ce2e8a6065a0f709e22a2e311274ce2f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757555 4856 scope.go:117] "RemoveContainer" containerID="3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757773 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6"} err="failed to get container status \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": rpc error: code = NotFound desc = could not find container \"3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6\": container with ID starting with 3ebd8160e7e508c96a2a9b8c76790f6ef59fbdcc8772a99026865e117d316de6 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757794 4856 scope.go:117] "RemoveContainer" 
containerID="0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.757999 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd"} err="failed to get container status \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": rpc error: code = NotFound desc = could not find container \"0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd\": container with ID starting with 0bc37ce27808cce1e74c8b626b20bdb0dedcf36708c96db7e91fa1f240b389bd not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758019 4856 scope.go:117] "RemoveContainer" containerID="3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758244 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352"} err="failed to get container status \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": rpc error: code = NotFound desc = could not find container \"3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352\": container with ID starting with 3e64c0e323b4575fa7dc48914bfeed8d4e00db5ce2ae9d8a9cc328b2a4949352 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758262 4856 scope.go:117] "RemoveContainer" containerID="f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758490 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635"} err="failed to get container status \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": rpc error: code = NotFound desc = could 
not find container \"f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635\": container with ID starting with f147761887e99cd69ba0ba80e8b7cd7509619987cf4ca40a7895998c00fca635 not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758511 4856 scope.go:117] "RemoveContainer" containerID="918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.758715 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f"} err="failed to get container status \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": rpc error: code = NotFound desc = could not find container \"918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f\": container with ID starting with 918a6cea69a0156079e3184fd1450bdccc427706f4d244f33cef54f13698952f not found: ID does not exist" Mar 20 13:37:43 crc kubenswrapper[4856]: I0320 13:37:43.826209 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a5ae28-8378-4545-af2d-cf1eb86364a2" path="/var/lib/kubelet/pods/24a5ae28-8378-4545-af2d-cf1eb86364a2/volumes" Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.370452 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chwcj_da4c21dd-2600-4141-bf05-7c18c1932a33/kube-multus/2.log" Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.370782 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chwcj" event={"ID":"da4c21dd-2600-4141-bf05-7c18c1932a33","Type":"ContainerStarted","Data":"fea608ae5ccc49b64bb97294d45e789059bd2b6f306039e8152456cbfe5c7ca4"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.376930 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" 
event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"f5ef00de83c7dfefb6492c5c0d55088aa5cd9f29e4c9c46abc7becb6fcfc9592"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.376965 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"82675ce652baee12a3a3051e2dfe65cd0dd70d6f2e5311192f9677ddbcb2d9ad"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.376981 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"1580ccc815ae49703addef06e00dcff18e639d958efa7776344a3e36682bf8aa"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.376992 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"b21438172365f6ada351ae61a1da56362e1ff34e89e39039a2a148a485a0fda2"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.377003 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"d94f3ca98056750f163bc4aa5b489aa3bf88a33e3fc3658ab4de1722e8f92420"} Mar 20 13:37:44 crc kubenswrapper[4856]: I0320 13:37:44.377015 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"80855f1e98552041060bb90bfc5d8eb2a0d2d21c2c66c3cdf35765510be36f07"} Mar 20 13:37:47 crc kubenswrapper[4856]: I0320 13:37:47.403090 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" 
event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"1516dc2c1a855c5a77149c80f2d3c84ebeb28eab0650eea5eb35e37808f603fe"} Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.420019 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" event={"ID":"89b85bac-6283-44ba-8b01-aa707ae861c1","Type":"ContainerStarted","Data":"8c662e6081c302f4f2ed961be1d24a259406c353176907bf544c7db2fbf9964f"} Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.420433 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.420449 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.420464 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.449052 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.455659 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" podStartSLOduration=7.455644284 podStartE2EDuration="7.455644284s" podCreationTimestamp="2026-03-20 13:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:49.455027076 +0000 UTC m=+884.336053216" watchObservedRunningTime="2026-03-20 13:37:49.455644284 +0000 UTC m=+884.336670404" Mar 20 13:37:49 crc kubenswrapper[4856]: I0320 13:37:49.459247 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-85td9" Mar 20 13:37:50 crc 
kubenswrapper[4856]: I0320 13:37:50.145123 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-n8c5c"] Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.147313 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.149736 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.149951 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.150073 4856 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6fh9x" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.150260 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.154043 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-n8c5c"] Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.209775 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.209839 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcq6p\" (UniqueName: \"kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 
13:37:50.209877 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.311440 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.311660 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcq6p\" (UniqueName: \"kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.311739 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.312098 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c" Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.312630 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.338215 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcq6p\" (UniqueName: \"kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p\") pod \"crc-storage-crc-n8c5c\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") " pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:50 crc kubenswrapper[4856]: I0320 13:37:50.477163 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:50 crc kubenswrapper[4856]: E0320 13:37:50.517376 4856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(cdcdf8555bdb5b0abeb47bc7de540dc8efc9bde7139895d7948e1d219aa9875f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:37:50 crc kubenswrapper[4856]: E0320 13:37:50.517454 4856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(cdcdf8555bdb5b0abeb47bc7de540dc8efc9bde7139895d7948e1d219aa9875f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:50 crc kubenswrapper[4856]: E0320 13:37:50.517488 4856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(cdcdf8555bdb5b0abeb47bc7de540dc8efc9bde7139895d7948e1d219aa9875f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:50 crc kubenswrapper[4856]: E0320 13:37:50.517555 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-n8c5c_crc-storage(d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-n8c5c_crc-storage(d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(cdcdf8555bdb5b0abeb47bc7de540dc8efc9bde7139895d7948e1d219aa9875f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-n8c5c" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7"
Mar 20 13:37:51 crc kubenswrapper[4856]: I0320 13:37:51.432020 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:51 crc kubenswrapper[4856]: I0320 13:37:51.432779 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:51 crc kubenswrapper[4856]: E0320 13:37:51.467730 4856 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(14e873eaa166fb68626391fa83ce7fcd1f79e6220fed0c7ddcf451f152f9225d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:37:51 crc kubenswrapper[4856]: E0320 13:37:51.467804 4856 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(14e873eaa166fb68626391fa83ce7fcd1f79e6220fed0c7ddcf451f152f9225d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:51 crc kubenswrapper[4856]: E0320 13:37:51.467830 4856 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(14e873eaa166fb68626391fa83ce7fcd1f79e6220fed0c7ddcf451f152f9225d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:37:51 crc kubenswrapper[4856]: E0320 13:37:51.467879 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-n8c5c_crc-storage(d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-n8c5c_crc-storage(d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-n8c5c_crc-storage_d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7_0(14e873eaa166fb68626391fa83ce7fcd1f79e6220fed0c7ddcf451f152f9225d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-n8c5c" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.147435 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-226wk"]
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.149187 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.152415 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.154505 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.157003 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-226wk"]
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.160578 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.164192 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5542\" (UniqueName: \"kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542\") pod \"auto-csr-approver-29566898-226wk\" (UID: \"b8408c48-989f-4f69-b388-c8d1d0e2e8ea\") " pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.266105 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5542\" (UniqueName: \"kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542\") pod \"auto-csr-approver-29566898-226wk\" (UID: \"b8408c48-989f-4f69-b388-c8d1d0e2e8ea\") " pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.292926 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5542\" (UniqueName: \"kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542\") pod \"auto-csr-approver-29566898-226wk\" (UID: \"b8408c48-989f-4f69-b388-c8d1d0e2e8ea\") " pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.478002 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.732002 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-226wk"]
Mar 20 13:38:00 crc kubenswrapper[4856]: I0320 13:38:00.743222 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:38:01 crc kubenswrapper[4856]: I0320 13:38:01.496902 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-226wk" event={"ID":"b8408c48-989f-4f69-b388-c8d1d0e2e8ea","Type":"ContainerStarted","Data":"ab1ac0adc410454160f98faabf16cd8b01d13f00622fa3eb87ba2ec7e0036e57"}
Mar 20 13:38:02 crc kubenswrapper[4856]: I0320 13:38:02.507497 4856 generic.go:334] "Generic (PLEG): container finished" podID="b8408c48-989f-4f69-b388-c8d1d0e2e8ea" containerID="c5d04cfd30d25aaf9a98a7aa45f4088204b21ee4445209e91e098b48ea2f7729" exitCode=0
Mar 20 13:38:02 crc kubenswrapper[4856]: I0320 13:38:02.507576 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-226wk" event={"ID":"b8408c48-989f-4f69-b388-c8d1d0e2e8ea","Type":"ContainerDied","Data":"c5d04cfd30d25aaf9a98a7aa45f4088204b21ee4445209e91e098b48ea2f7729"}
Mar 20 13:38:03 crc kubenswrapper[4856]: I0320 13:38:03.785659 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:03 crc kubenswrapper[4856]: I0320 13:38:03.819411 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:38:03 crc kubenswrapper[4856]: I0320 13:38:03.819940 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:38:03 crc kubenswrapper[4856]: I0320 13:38:03.934357 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5542\" (UniqueName: \"kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542\") pod \"b8408c48-989f-4f69-b388-c8d1d0e2e8ea\" (UID: \"b8408c48-989f-4f69-b388-c8d1d0e2e8ea\") "
Mar 20 13:38:03 crc kubenswrapper[4856]: I0320 13:38:03.939854 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542" (OuterVolumeSpecName: "kube-api-access-w5542") pod "b8408c48-989f-4f69-b388-c8d1d0e2e8ea" (UID: "b8408c48-989f-4f69-b388-c8d1d0e2e8ea"). InnerVolumeSpecName "kube-api-access-w5542". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.036014 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5542\" (UniqueName: \"kubernetes.io/projected/b8408c48-989f-4f69-b388-c8d1d0e2e8ea-kube-api-access-w5542\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.050136 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-n8c5c"]
Mar 20 13:38:04 crc kubenswrapper[4856]: W0320 13:38:04.060993 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd50d0d3b_ef22_4f84_9d11_e648ccc4b7e7.slice/crio-c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f WatchSource:0}: Error finding container c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f: Status 404 returned error can't find the container with id c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.520803 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-226wk" event={"ID":"b8408c48-989f-4f69-b388-c8d1d0e2e8ea","Type":"ContainerDied","Data":"ab1ac0adc410454160f98faabf16cd8b01d13f00622fa3eb87ba2ec7e0036e57"}
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.520849 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab1ac0adc410454160f98faabf16cd8b01d13f00622fa3eb87ba2ec7e0036e57"
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.520865 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-226wk"
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.522304 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n8c5c" event={"ID":"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7","Type":"ContainerStarted","Data":"c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f"}
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.836947 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-5scpg"]
Mar 20 13:38:04 crc kubenswrapper[4856]: I0320 13:38:04.840068 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-5scpg"]
Mar 20 13:38:05 crc kubenswrapper[4856]: I0320 13:38:05.529808 4856 generic.go:334] "Generic (PLEG): container finished" podID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" containerID="2c182ab18c50fa8f22b8c3303f1801993710fd7b51c6d0c33910504a413a9c00" exitCode=0
Mar 20 13:38:05 crc kubenswrapper[4856]: I0320 13:38:05.529915 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n8c5c" event={"ID":"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7","Type":"ContainerDied","Data":"2c182ab18c50fa8f22b8c3303f1801993710fd7b51c6d0c33910504a413a9c00"}
Mar 20 13:38:05 crc kubenswrapper[4856]: I0320 13:38:05.831136 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fc9d40c-f2a7-4304-b13e-53f6e93446ae" path="/var/lib/kubelet/pods/2fc9d40c-f2a7-4304-b13e-53f6e93446ae/volumes"
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.778150 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.876836 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage\") pod \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") "
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.876883 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcq6p\" (UniqueName: \"kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p\") pod \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") "
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.876954 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt\") pod \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\" (UID: \"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7\") "
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.877218 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" (UID: "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.883710 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p" (OuterVolumeSpecName: "kube-api-access-vcq6p") pod "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" (UID: "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7"). InnerVolumeSpecName "kube-api-access-vcq6p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.906679 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" (UID: "d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.978535 4856 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-crc-storage\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.978575 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcq6p\" (UniqueName: \"kubernetes.io/projected/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-kube-api-access-vcq6p\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:06 crc kubenswrapper[4856]: I0320 13:38:06.978588 4856 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7-node-mnt\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:07 crc kubenswrapper[4856]: I0320 13:38:07.546048 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-n8c5c" event={"ID":"d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7","Type":"ContainerDied","Data":"c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f"}
Mar 20 13:38:07 crc kubenswrapper[4856]: I0320 13:38:07.546457 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0405f24163ffcee256b7560f85406c7a5f8b64ecec832deb2638ede4657f98f"
Mar 20 13:38:07 crc kubenswrapper[4856]: I0320 13:38:07.546091 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-n8c5c"
Mar 20 13:38:09 crc kubenswrapper[4856]: I0320 13:38:09.988038 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:38:09 crc kubenswrapper[4856]: I0320 13:38:09.988339 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:38:13 crc kubenswrapper[4856]: I0320 13:38:13.060806 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-85td9"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.820553 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"]
Mar 20 13:38:14 crc kubenswrapper[4856]: E0320 13:38:14.820935 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8408c48-989f-4f69-b388-c8d1d0e2e8ea" containerName="oc"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.820959 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8408c48-989f-4f69-b388-c8d1d0e2e8ea" containerName="oc"
Mar 20 13:38:14 crc kubenswrapper[4856]: E0320 13:38:14.820978 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" containerName="storage"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.820991 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" containerName="storage"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.821191 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8408c48-989f-4f69-b388-c8d1d0e2e8ea" containerName="oc"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.821209 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" containerName="storage"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.822494 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.826243 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.839546 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"]
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.994308 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rvcf\" (UniqueName: \"kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.994408 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:14 crc kubenswrapper[4856]: I0320 13:38:14.994595 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.096891 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.095995 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.097052 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.097824 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.098104 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rvcf\" (UniqueName: \"kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.134668 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rvcf\" (UniqueName: \"kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.151531 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.383377 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"]
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.609222 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerStarted","Data":"7192efe18298493b3661f2ff2ed18545120fbd0024aa9d1c687ec6b2c86ed639"}
Mar 20 13:38:15 crc kubenswrapper[4856]: I0320 13:38:15.609734 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerStarted","Data":"cea13e0c439c2822fff1b3ce2c2fb72aa45a25ea2f04fafd0a4a6fe7a1a9ef3f"}
Mar 20 13:38:16 crc kubenswrapper[4856]: I0320 13:38:16.621534 4856 generic.go:334] "Generic (PLEG): container finished" podID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerID="7192efe18298493b3661f2ff2ed18545120fbd0024aa9d1c687ec6b2c86ed639" exitCode=0
Mar 20 13:38:16 crc kubenswrapper[4856]: I0320 13:38:16.625997 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerDied","Data":"7192efe18298493b3661f2ff2ed18545120fbd0024aa9d1c687ec6b2c86ed639"}
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.160845 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"]
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.162781 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.171668 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"]
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.329773 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74q4h\" (UniqueName: \"kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.330044 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.330084 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.431508 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74q4h\" (UniqueName: \"kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.431585 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.431637 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.432226 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.432313 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.450148 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74q4h\" (UniqueName: \"kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h\") pod \"redhat-operators-vq4vn\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.488992 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vq4vn"
Mar 20 13:38:17 crc kubenswrapper[4856]: I0320 13:38:17.663702 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"]
Mar 20 13:38:17 crc kubenswrapper[4856]: W0320 13:38:17.667978 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8913175e_8ca7_4618_b12d_3d2f3fe0b803.slice/crio-572b604617af7e07ac34b8c66115441cbe92fbf5bf75e49907826a4351aaeacb WatchSource:0}: Error finding container 572b604617af7e07ac34b8c66115441cbe92fbf5bf75e49907826a4351aaeacb: Status 404 returned error can't find the container with id 572b604617af7e07ac34b8c66115441cbe92fbf5bf75e49907826a4351aaeacb
Mar 20 13:38:18 crc kubenswrapper[4856]: I0320 13:38:18.634367 4856 generic.go:334] "Generic (PLEG): container finished" podID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerID="fd0d7e1b48e7fb5a694d203fd37e626584d96125bc9d7f5a1f64e61744da3e75" exitCode=0
Mar 20 13:38:18 crc kubenswrapper[4856]: I0320 13:38:18.634427 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerDied","Data":"fd0d7e1b48e7fb5a694d203fd37e626584d96125bc9d7f5a1f64e61744da3e75"}
Mar 20 13:38:18 crc kubenswrapper[4856]: I0320 13:38:18.634476 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerStarted","Data":"572b604617af7e07ac34b8c66115441cbe92fbf5bf75e49907826a4351aaeacb"}
Mar 20 13:38:18 crc kubenswrapper[4856]: I0320 13:38:18.636710 4856 generic.go:334] "Generic (PLEG): container finished" podID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerID="e8436ad4cf1c66760396abe770cb61500116c1026a7fe8ed440cc3a3c0f38775" exitCode=0
Mar 20 13:38:18 crc kubenswrapper[4856]: I0320 13:38:18.636759 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerDied","Data":"e8436ad4cf1c66760396abe770cb61500116c1026a7fe8ed440cc3a3c0f38775"}
Mar 20 13:38:19 crc kubenswrapper[4856]: I0320 13:38:19.649791 4856 generic.go:334] "Generic (PLEG): container finished" podID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerID="6b006a55a37d067c49b2577f162f2fbea94ddcfb7cb1fbf9ba7dcd4cbd1e90ff" exitCode=0
Mar 20 13:38:19 crc kubenswrapper[4856]: I0320 13:38:19.650018 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerDied","Data":"6b006a55a37d067c49b2577f162f2fbea94ddcfb7cb1fbf9ba7dcd4cbd1e90ff"}
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.665042 4856 generic.go:334] "Generic (PLEG): container finished" podID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerID="2e6ce131cfea236737b17f9e234610db7bb5afcd154a42a3d15bdb6a05d96c07" exitCode=0
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.665107 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerDied","Data":"2e6ce131cfea236737b17f9e234610db7bb5afcd154a42a3d15bdb6a05d96c07"}
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.889470 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.977639 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle\") pod \"69075990-e502-4fda-b16b-ec1247bcb9a2\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") "
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.977845 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rvcf\" (UniqueName: \"kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf\") pod \"69075990-e502-4fda-b16b-ec1247bcb9a2\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") "
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.977936 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util\") pod \"69075990-e502-4fda-b16b-ec1247bcb9a2\" (UID: \"69075990-e502-4fda-b16b-ec1247bcb9a2\") "
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.978327 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle" (OuterVolumeSpecName: "bundle") pod "69075990-e502-4fda-b16b-ec1247bcb9a2" (UID: "69075990-e502-4fda-b16b-ec1247bcb9a2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.979567 4856 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:20 crc kubenswrapper[4856]: I0320 13:38:20.984573 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf" (OuterVolumeSpecName: "kube-api-access-8rvcf") pod "69075990-e502-4fda-b16b-ec1247bcb9a2" (UID: "69075990-e502-4fda-b16b-ec1247bcb9a2"). InnerVolumeSpecName "kube-api-access-8rvcf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.080725 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rvcf\" (UniqueName: \"kubernetes.io/projected/69075990-e502-4fda-b16b-ec1247bcb9a2-kube-api-access-8rvcf\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.214352 4856 scope.go:117] "RemoveContainer" containerID="ef5cba9659c688772e38e670b24db43c7564df4637cba6e8a73545d54b51819b"
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.282424 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util" (OuterVolumeSpecName: "util") pod "69075990-e502-4fda-b16b-ec1247bcb9a2" (UID: "69075990-e502-4fda-b16b-ec1247bcb9a2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.283893 4856 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/69075990-e502-4fda-b16b-ec1247bcb9a2-util\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.673546 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v"
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.673553 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v" event={"ID":"69075990-e502-4fda-b16b-ec1247bcb9a2","Type":"ContainerDied","Data":"cea13e0c439c2822fff1b3ce2c2fb72aa45a25ea2f04fafd0a4a6fe7a1a9ef3f"}
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.674875 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea13e0c439c2822fff1b3ce2c2fb72aa45a25ea2f04fafd0a4a6fe7a1a9ef3f"
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.676986 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerStarted","Data":"db9ce9ea05073dbec40f2b382b27f87eb0cf48a148f5bbecad48cc263c7e496a"}
Mar 20 13:38:21 crc kubenswrapper[4856]: I0320 13:38:21.700951 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vq4vn" podStartSLOduration=2.000079384 podStartE2EDuration="4.700922663s" podCreationTimestamp="2026-03-20 13:38:17 +0000 UTC" firstStartedPulling="2026-03-20 13:38:18.636286634 +0000 UTC m=+913.517312814" lastFinishedPulling="2026-03-20 13:38:21.337129953 +0000 UTC m=+916.218156093" observedRunningTime="2026-03-20 13:38:21.698621022 +0000 UTC m=+916.579647182"
watchObservedRunningTime="2026-03-20 13:38:21.700922663 +0000 UTC m=+916.581948833" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.720879 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2"] Mar 20 13:38:24 crc kubenswrapper[4856]: E0320 13:38:24.721131 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="extract" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.721145 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="extract" Mar 20 13:38:24 crc kubenswrapper[4856]: E0320 13:38:24.721169 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="pull" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.721177 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="pull" Mar 20 13:38:24 crc kubenswrapper[4856]: E0320 13:38:24.721189 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="util" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.721199 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="util" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.721334 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="69075990-e502-4fda-b16b-ec1247bcb9a2" containerName="extract" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.721751 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.723868 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.724429 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-t6gqz" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.724921 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.733007 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2"] Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.829943 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px866\" (UniqueName: \"kubernetes.io/projected/69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9-kube-api-access-px866\") pod \"nmstate-operator-796d4cfff4-s6jj2\" (UID: \"69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.931309 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px866\" (UniqueName: \"kubernetes.io/projected/69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9-kube-api-access-px866\") pod \"nmstate-operator-796d4cfff4-s6jj2\" (UID: \"69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" Mar 20 13:38:24 crc kubenswrapper[4856]: I0320 13:38:24.958379 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px866\" (UniqueName: \"kubernetes.io/projected/69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9-kube-api-access-px866\") pod \"nmstate-operator-796d4cfff4-s6jj2\" (UID: 
\"69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" Mar 20 13:38:25 crc kubenswrapper[4856]: I0320 13:38:25.034982 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" Mar 20 13:38:25 crc kubenswrapper[4856]: I0320 13:38:25.471174 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2"] Mar 20 13:38:25 crc kubenswrapper[4856]: I0320 13:38:25.702516 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" event={"ID":"69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9","Type":"ContainerStarted","Data":"1e082764242f73c3deded875f253800d0c97ad7e293ecef3d7ffde56a30d5cf2"} Mar 20 13:38:27 crc kubenswrapper[4856]: I0320 13:38:27.490117 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:27 crc kubenswrapper[4856]: I0320 13:38:27.490352 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:28 crc kubenswrapper[4856]: I0320 13:38:28.533650 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vq4vn" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="registry-server" probeResult="failure" output=< Mar 20 13:38:28 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:38:28 crc kubenswrapper[4856]: > Mar 20 13:38:33 crc kubenswrapper[4856]: I0320 13:38:33.769935 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" event={"ID":"69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9","Type":"ContainerStarted","Data":"bf177d7000e225106850d9b10db1923372f80b1d9bf3fba178d7a7f8a2f8c51e"} Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.774562 4856 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-s6jj2" podStartSLOduration=3.63936921 podStartE2EDuration="10.774538834s" podCreationTimestamp="2026-03-20 13:38:24 +0000 UTC" firstStartedPulling="2026-03-20 13:38:25.479498882 +0000 UTC m=+920.360525012" lastFinishedPulling="2026-03-20 13:38:32.614668476 +0000 UTC m=+927.495694636" observedRunningTime="2026-03-20 13:38:33.798723518 +0000 UTC m=+928.679749688" watchObservedRunningTime="2026-03-20 13:38:34.774538834 +0000 UTC m=+929.655564984" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.777824 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.779344 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.781854 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-jt56k" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.782647 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2cppz"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.783472 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.787750 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.802638 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2cppz"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.809512 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-kc6cb"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.810383 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.819544 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.860563 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c7n2\" (UniqueName: \"kubernetes.io/projected/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-kube-api-access-5c7n2\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.860864 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.861009 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h4mdg\" (UniqueName: \"kubernetes.io/projected/dd1541ea-2421-4a7c-b572-c77955f3f748-kube-api-access-h4mdg\") pod \"nmstate-metrics-9b8c8685d-vc7rs\" (UID: \"dd1541ea-2421-4a7c-b572-c77955f3f748\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.928076 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.929041 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.932715 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.932774 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.932834 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-d2jw8" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.938969 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn"] Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962134 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-ovs-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962245 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c7n2\" (UniqueName: 
\"kubernetes.io/projected/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-kube-api-access-5c7n2\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962321 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-dbus-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962344 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962371 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-nmstate-lock\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962390 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4mdg\" (UniqueName: \"kubernetes.io/projected/dd1541ea-2421-4a7c-b572-c77955f3f748-kube-api-access-h4mdg\") pod \"nmstate-metrics-9b8c8685d-vc7rs\" (UID: \"dd1541ea-2421-4a7c-b572-c77955f3f748\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.962415 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2rdg9\" (UniqueName: \"kubernetes.io/projected/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-kube-api-access-2rdg9\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:34 crc kubenswrapper[4856]: E0320 13:38:34.962678 4856 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 13:38:34 crc kubenswrapper[4856]: E0320 13:38:34.962780 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair podName:57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:35.462757294 +0000 UTC m=+930.343783514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair") pod "nmstate-webhook-5f558f5558-2cppz" (UID: "57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1") : secret "openshift-nmstate-webhook" not found Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.980426 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4mdg\" (UniqueName: \"kubernetes.io/projected/dd1541ea-2421-4a7c-b572-c77955f3f748-kube-api-access-h4mdg\") pod \"nmstate-metrics-9b8c8685d-vc7rs\" (UID: \"dd1541ea-2421-4a7c-b572-c77955f3f748\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" Mar 20 13:38:34 crc kubenswrapper[4856]: I0320 13:38:34.982665 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c7n2\" (UniqueName: \"kubernetes.io/projected/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-kube-api-access-5c7n2\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064328 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-nmstate-lock\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064374 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rdg9\" (UniqueName: \"kubernetes.io/projected/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-kube-api-access-2rdg9\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064395 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f42479ac-4c93-4072-a6e1-3055d25b5dfd-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064428 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-ovs-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064462 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv5zf\" (UniqueName: \"kubernetes.io/projected/f42479ac-4c93-4072-a6e1-3055d25b5dfd-kube-api-access-gv5zf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc 
kubenswrapper[4856]: I0320 13:38:35.064508 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-ovs-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064584 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f42479ac-4c93-4072-a6e1-3055d25b5dfd-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064657 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-dbus-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.064739 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-nmstate-lock\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.065008 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-dbus-socket\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.099337 4856 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.100926 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rdg9\" (UniqueName: \"kubernetes.io/projected/59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870-kube-api-access-2rdg9\") pod \"nmstate-handler-kc6cb\" (UID: \"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870\") " pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.126374 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f6f6d7fc5-rvlbt"] Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.126385 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.127146 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.140003 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6f6d7fc5-rvlbt"] Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.167359 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f42479ac-4c93-4072-a6e1-3055d25b5dfd-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.167442 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv5zf\" (UniqueName: \"kubernetes.io/projected/f42479ac-4c93-4072-a6e1-3055d25b5dfd-kube-api-access-gv5zf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.167471 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f42479ac-4c93-4072-a6e1-3055d25b5dfd-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.173397 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f42479ac-4c93-4072-a6e1-3055d25b5dfd-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.174680 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f42479ac-4c93-4072-a6e1-3055d25b5dfd-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.189323 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv5zf\" (UniqueName: \"kubernetes.io/projected/f42479ac-4c93-4072-a6e1-3055d25b5dfd-kube-api-access-gv5zf\") pod \"nmstate-console-plugin-86f58fcf4-bvbkn\" (UID: \"f42479ac-4c93-4072-a6e1-3055d25b5dfd\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.248784 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.268796 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-oauth-config\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.268872 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.269208 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h996h\" (UniqueName: \"kubernetes.io/projected/ac902a97-b154-41b0-a71b-fae125139617-kube-api-access-h996h\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.269444 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-service-ca\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.269527 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-oauth-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.269555 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-console-config\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.269604 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-trusted-ca-bundle\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.299922 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs"] Mar 20 13:38:35 crc kubenswrapper[4856]: W0320 13:38:35.310335 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd1541ea_2421_4a7c_b572_c77955f3f748.slice/crio-881914a2dc568731b988d92a930785799d6a899d3ca2bb13a54246b3e46c6493 WatchSource:0}: Error finding container 881914a2dc568731b988d92a930785799d6a899d3ca2bb13a54246b3e46c6493: Status 404 returned error can't find the container with id 881914a2dc568731b988d92a930785799d6a899d3ca2bb13a54246b3e46c6493 Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371037 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-oauth-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371086 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-console-config\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371105 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-trusted-ca-bundle\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371153 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-oauth-config\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371173 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371216 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h996h\" (UniqueName: 
\"kubernetes.io/projected/ac902a97-b154-41b0-a71b-fae125139617-kube-api-access-h996h\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.371288 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-service-ca\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.372261 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-oauth-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.372307 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-service-ca\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.372316 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-trusted-ca-bundle\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.372872 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ac902a97-b154-41b0-a71b-fae125139617-console-config\") 
pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.378965 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-serving-cert\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.381199 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ac902a97-b154-41b0-a71b-fae125139617-console-oauth-config\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.388876 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h996h\" (UniqueName: \"kubernetes.io/projected/ac902a97-b154-41b0-a71b-fae125139617-kube-api-access-h996h\") pod \"console-7f6f6d7fc5-rvlbt\" (UID: \"ac902a97-b154-41b0-a71b-fae125139617\") " pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.472643 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: \"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.477728 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2cppz\" (UID: 
\"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.486714 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.694808 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn"] Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.715977 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.761614 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f6f6d7fc5-rvlbt"] Mar 20 13:38:35 crc kubenswrapper[4856]: W0320 13:38:35.764951 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac902a97_b154_41b0_a71b_fae125139617.slice/crio-5b5f33015b233e6f42368c26da01d44c594ad1c0b2cb65621060ec0863717c5c WatchSource:0}: Error finding container 5b5f33015b233e6f42368c26da01d44c594ad1c0b2cb65621060ec0863717c5c: Status 404 returned error can't find the container with id 5b5f33015b233e6f42368c26da01d44c594ad1c0b2cb65621060ec0863717c5c Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.781229 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" event={"ID":"f42479ac-4c93-4072-a6e1-3055d25b5dfd","Type":"ContainerStarted","Data":"32103346281b6079542f06980080a38a9776bc541eecc40e79520c058974bba4"} Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.782042 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kc6cb" 
event={"ID":"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870","Type":"ContainerStarted","Data":"d4fa4b2aaabb92e225c2f04ce6b62166ffbdbbdff7f91b2665d7f1ab7e1c9398"} Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.782992 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6f6d7fc5-rvlbt" event={"ID":"ac902a97-b154-41b0-a71b-fae125139617","Type":"ContainerStarted","Data":"5b5f33015b233e6f42368c26da01d44c594ad1c0b2cb65621060ec0863717c5c"} Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.784136 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" event={"ID":"dd1541ea-2421-4a7c-b572-c77955f3f748","Type":"ContainerStarted","Data":"881914a2dc568731b988d92a930785799d6a899d3ca2bb13a54246b3e46c6493"} Mar 20 13:38:35 crc kubenswrapper[4856]: I0320 13:38:35.912035 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2cppz"] Mar 20 13:38:35 crc kubenswrapper[4856]: W0320 13:38:35.917533 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57f6e9d7_6cfd_47f8_99a5_7e2dc054bbd1.slice/crio-cb9de42431be066a13c23d5cb2aaddb77eed7dd545c99d0fb54ecd71b10b5470 WatchSource:0}: Error finding container cb9de42431be066a13c23d5cb2aaddb77eed7dd545c99d0fb54ecd71b10b5470: Status 404 returned error can't find the container with id cb9de42431be066a13c23d5cb2aaddb77eed7dd545c99d0fb54ecd71b10b5470 Mar 20 13:38:36 crc kubenswrapper[4856]: I0320 13:38:36.803149 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f6f6d7fc5-rvlbt" event={"ID":"ac902a97-b154-41b0-a71b-fae125139617","Type":"ContainerStarted","Data":"f47d52c93087e0a84ee374d1f4df5e4e5e931506743f0911bbf4c3aa15e86bd0"} Mar 20 13:38:36 crc kubenswrapper[4856]: I0320 13:38:36.806866 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" event={"ID":"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1","Type":"ContainerStarted","Data":"cb9de42431be066a13c23d5cb2aaddb77eed7dd545c99d0fb54ecd71b10b5470"} Mar 20 13:38:36 crc kubenswrapper[4856]: I0320 13:38:36.823103 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f6f6d7fc5-rvlbt" podStartSLOduration=1.8230844 podStartE2EDuration="1.8230844s" podCreationTimestamp="2026-03-20 13:38:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:38:36.819755033 +0000 UTC m=+931.700781163" watchObservedRunningTime="2026-03-20 13:38:36.8230844 +0000 UTC m=+931.704110520" Mar 20 13:38:37 crc kubenswrapper[4856]: I0320 13:38:37.531948 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:37 crc kubenswrapper[4856]: I0320 13:38:37.571338 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:37 crc kubenswrapper[4856]: I0320 13:38:37.762912 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"] Mar 20 13:38:38 crc kubenswrapper[4856]: I0320 13:38:38.818563 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vq4vn" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="registry-server" containerID="cri-o://db9ce9ea05073dbec40f2b382b27f87eb0cf48a148f5bbecad48cc263c7e496a" gracePeriod=2 Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.843740 4856 generic.go:334] "Generic (PLEG): container finished" podID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerID="db9ce9ea05073dbec40f2b382b27f87eb0cf48a148f5bbecad48cc263c7e496a" exitCode=0 Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 
13:38:39.843810 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerDied","Data":"db9ce9ea05073dbec40f2b382b27f87eb0cf48a148f5bbecad48cc263c7e496a"} Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.988041 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.988115 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.988176 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.988864 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:38:39 crc kubenswrapper[4856]: I0320 13:38:39.988934 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" 
containerID="cri-o://4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3" gracePeriod=600 Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.414689 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.541826 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities\") pod \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.541943 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74q4h\" (UniqueName: \"kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h\") pod \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.541997 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content\") pod \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\" (UID: \"8913175e-8ca7-4618-b12d-3d2f3fe0b803\") " Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.543363 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities" (OuterVolumeSpecName: "utilities") pod "8913175e-8ca7-4618-b12d-3d2f3fe0b803" (UID: "8913175e-8ca7-4618-b12d-3d2f3fe0b803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.547667 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h" (OuterVolumeSpecName: "kube-api-access-74q4h") pod "8913175e-8ca7-4618-b12d-3d2f3fe0b803" (UID: "8913175e-8ca7-4618-b12d-3d2f3fe0b803"). InnerVolumeSpecName "kube-api-access-74q4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.643503 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74q4h\" (UniqueName: \"kubernetes.io/projected/8913175e-8ca7-4618-b12d-3d2f3fe0b803-kube-api-access-74q4h\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.643727 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.715156 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8913175e-8ca7-4618-b12d-3d2f3fe0b803" (UID: "8913175e-8ca7-4618-b12d-3d2f3fe0b803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.744373 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8913175e-8ca7-4618-b12d-3d2f3fe0b803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.852246 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" event={"ID":"f42479ac-4c93-4072-a6e1-3055d25b5dfd","Type":"ContainerStarted","Data":"7b7317d1d67d9f121375c77333d14d1c2e9246045612f717aff074948e17c7dd"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.854907 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-kc6cb" event={"ID":"59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870","Type":"ContainerStarted","Data":"06b50677cf76b5cb90834d00de150f53e4c357ba3b5827d6b22693f1f35e70b4"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.855083 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.857664 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3" exitCode=0 Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.857711 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.857784 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.857817 4856 scope.go:117] "RemoveContainer" containerID="22105727d36a642919389fde41d5e0048dd797f35a5cd95a22485e2d7ccc90be" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.863934 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" event={"ID":"dd1541ea-2421-4a7c-b572-c77955f3f748","Type":"ContainerStarted","Data":"cf23437a78dc628e5bce5e54649c7bf7c91a00e90b9a92fbccccc22505ab4a37"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.867367 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" event={"ID":"57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1","Type":"ContainerStarted","Data":"aff17ad9f59731a6bd20de0dfafd5a0146700666f7dcf6882fc619a13a55a7ea"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.867479 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.881330 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vq4vn" event={"ID":"8913175e-8ca7-4618-b12d-3d2f3fe0b803","Type":"ContainerDied","Data":"572b604617af7e07ac34b8c66115441cbe92fbf5bf75e49907826a4351aaeacb"} Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.881831 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vq4vn" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.891031 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bvbkn" podStartSLOduration=2.127528252 podStartE2EDuration="6.890992976s" podCreationTimestamp="2026-03-20 13:38:34 +0000 UTC" firstStartedPulling="2026-03-20 13:38:35.699374952 +0000 UTC m=+930.580401092" lastFinishedPulling="2026-03-20 13:38:40.462839686 +0000 UTC m=+935.343865816" observedRunningTime="2026-03-20 13:38:40.881456715 +0000 UTC m=+935.762482865" watchObservedRunningTime="2026-03-20 13:38:40.890992976 +0000 UTC m=+935.772019136" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.919488 4856 scope.go:117] "RemoveContainer" containerID="db9ce9ea05073dbec40f2b382b27f87eb0cf48a148f5bbecad48cc263c7e496a" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.940595 4856 scope.go:117] "RemoveContainer" containerID="2e6ce131cfea236737b17f9e234610db7bb5afcd154a42a3d15bdb6a05d96c07" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.942382 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-kc6cb" podStartSLOduration=2.1621885 podStartE2EDuration="6.942367123s" podCreationTimestamp="2026-03-20 13:38:34 +0000 UTC" firstStartedPulling="2026-03-20 13:38:35.168461024 +0000 UTC m=+930.049487154" lastFinishedPulling="2026-03-20 13:38:39.948639647 +0000 UTC m=+934.829665777" observedRunningTime="2026-03-20 13:38:40.936822288 +0000 UTC m=+935.817848438" watchObservedRunningTime="2026-03-20 13:38:40.942367123 +0000 UTC m=+935.823393253" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.960718 4856 scope.go:117] "RemoveContainer" containerID="fd0d7e1b48e7fb5a694d203fd37e626584d96125bc9d7f5a1f64e61744da3e75" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.962558 4856 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" podStartSLOduration=2.422018032 podStartE2EDuration="6.962520773s" podCreationTimestamp="2026-03-20 13:38:34 +0000 UTC" firstStartedPulling="2026-03-20 13:38:35.919414108 +0000 UTC m=+930.800440238" lastFinishedPulling="2026-03-20 13:38:40.459916829 +0000 UTC m=+935.340942979" observedRunningTime="2026-03-20 13:38:40.95556785 +0000 UTC m=+935.836594010" watchObservedRunningTime="2026-03-20 13:38:40.962520773 +0000 UTC m=+935.843546923" Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.978779 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"] Mar 20 13:38:40 crc kubenswrapper[4856]: I0320 13:38:40.984241 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vq4vn"] Mar 20 13:38:41 crc kubenswrapper[4856]: I0320 13:38:41.832929 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" path="/var/lib/kubelet/pods/8913175e-8ca7-4618-b12d-3d2f3fe0b803/volumes" Mar 20 13:38:43 crc kubenswrapper[4856]: I0320 13:38:43.918246 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" event={"ID":"dd1541ea-2421-4a7c-b572-c77955f3f748","Type":"ContainerStarted","Data":"f61213a1b486f8355677d44837e9b044e10644338add7b99081408f1d71b0663"} Mar 20 13:38:43 crc kubenswrapper[4856]: I0320 13:38:43.942639 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-vc7rs" podStartSLOduration=2.185656817 podStartE2EDuration="9.942607552s" podCreationTimestamp="2026-03-20 13:38:34 +0000 UTC" firstStartedPulling="2026-03-20 13:38:35.31303566 +0000 UTC m=+930.194061800" lastFinishedPulling="2026-03-20 13:38:43.069986385 +0000 UTC m=+937.951012535" observedRunningTime="2026-03-20 13:38:43.934811837 +0000 UTC m=+938.815838067" 
watchObservedRunningTime="2026-03-20 13:38:43.942607552 +0000 UTC m=+938.823633722" Mar 20 13:38:45 crc kubenswrapper[4856]: I0320 13:38:45.157960 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-kc6cb" Mar 20 13:38:45 crc kubenswrapper[4856]: I0320 13:38:45.487710 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:45 crc kubenswrapper[4856]: I0320 13:38:45.488044 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:45 crc kubenswrapper[4856]: I0320 13:38:45.493336 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:45 crc kubenswrapper[4856]: I0320 13:38:45.940738 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f6f6d7fc5-rvlbt" Mar 20 13:38:46 crc kubenswrapper[4856]: I0320 13:38:46.019112 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"] Mar 20 13:38:55 crc kubenswrapper[4856]: I0320 13:38:55.726677 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2cppz" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.949507 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v"] Mar 20 13:39:09 crc kubenswrapper[4856]: E0320 13:39:09.952919 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="registry-server" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.953169 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="registry-server" Mar 20 13:39:09 crc 
kubenswrapper[4856]: E0320 13:39:09.953442 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="extract-utilities" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.953687 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="extract-utilities" Mar 20 13:39:09 crc kubenswrapper[4856]: E0320 13:39:09.953992 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="extract-content" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.954216 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="extract-content" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.955079 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8913175e-8ca7-4618-b12d-3d2f3fe0b803" containerName="registry-server" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.958816 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.965608 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:39:09 crc kubenswrapper[4856]: I0320 13:39:09.973559 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v"] Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.089589 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.089677 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.089738 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt5t4\" (UniqueName: \"kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: 
I0320 13:39:10.190815 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt5t4\" (UniqueName: \"kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.190953 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.191036 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.191846 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.191966 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.223819 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt5t4\" (UniqueName: \"kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.283056 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:10 crc kubenswrapper[4856]: I0320 13:39:10.750038 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v"] Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.080527 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-jwjhv" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" containerID="cri-o://12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131" gracePeriod=15 Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.132999 4856 generic.go:334] "Generic (PLEG): container finished" podID="473deb71-de01-4290-a70d-f21794be8f0e" containerID="c78bd25a7103fe049a17cea2644d87ec379fa2dbf336d7a6705dee8f251b9b1f" exitCode=0 Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.133063 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerDied","Data":"c78bd25a7103fe049a17cea2644d87ec379fa2dbf336d7a6705dee8f251b9b1f"} Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.133111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerStarted","Data":"8335fcaaa9d592f1468fffd669a2bddde30887ad9f80eafad99de90f13b251d2"} Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.490452 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jwjhv_dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4/console/0.log" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.490830 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.607885 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.607943 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj9m9\" (UniqueName: \"kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.607987 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.608040 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.608065 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.608087 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.608180 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config\") pod \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\" (UID: \"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4\") " Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.609095 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.609186 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.609242 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config" (OuterVolumeSpecName: "console-config") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.609816 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.616880 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9" (OuterVolumeSpecName: "kube-api-access-tj9m9") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "kube-api-access-tj9m9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.616936 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.620259 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" (UID: "dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710064 4856 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710119 4856 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710139 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tj9m9\" (UniqueName: \"kubernetes.io/projected/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-kube-api-access-tj9m9\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710161 4856 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710180 4856 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710197 4856 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:11 crc kubenswrapper[4856]: I0320 13:39:11.710216 4856 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.144810 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jwjhv_dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4/console/0.log" Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.144892 4856 generic.go:334] "Generic (PLEG): container finished" podID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerID="12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131" exitCode=2 Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.144938 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jwjhv" event={"ID":"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4","Type":"ContainerDied","Data":"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131"} Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.144976 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jwjhv" 
event={"ID":"dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4","Type":"ContainerDied","Data":"ff53990062952169dd1f8a2c4acf4fa8c3ddb756e792909fe2e0b62cfa94eb7b"} Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.145022 4856 scope.go:117] "RemoveContainer" containerID="12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131" Mar 20 13:39:12 crc kubenswrapper[4856]: I0320 13:39:12.145215 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jwjhv" Mar 20 13:39:13 crc kubenswrapper[4856]: I0320 13:39:12.177828 4856 scope.go:117] "RemoveContainer" containerID="12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131" Mar 20 13:39:13 crc kubenswrapper[4856]: E0320 13:39:12.178621 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131\": container with ID starting with 12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131 not found: ID does not exist" containerID="12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131" Mar 20 13:39:13 crc kubenswrapper[4856]: I0320 13:39:12.178673 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131"} err="failed to get container status \"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131\": rpc error: code = NotFound desc = could not find container \"12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131\": container with ID starting with 12c266a0b19b63653618f2028cc82ac5f05909e8ae8b7a17f95a8a0bdf70f131 not found: ID does not exist" Mar 20 13:39:13 crc kubenswrapper[4856]: I0320 13:39:12.180407 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"] Mar 20 13:39:13 crc kubenswrapper[4856]: I0320 13:39:12.188067 4856 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jwjhv"] Mar 20 13:39:13 crc kubenswrapper[4856]: I0320 13:39:13.831832 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" path="/var/lib/kubelet/pods/dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4/volumes" Mar 20 13:39:14 crc kubenswrapper[4856]: I0320 13:39:14.163033 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerStarted","Data":"36f530c2c202cb39f9a87758069561be2b5b43cd565432b9d642c1b41096e258"} Mar 20 13:39:15 crc kubenswrapper[4856]: I0320 13:39:15.179557 4856 generic.go:334] "Generic (PLEG): container finished" podID="473deb71-de01-4290-a70d-f21794be8f0e" containerID="36f530c2c202cb39f9a87758069561be2b5b43cd565432b9d642c1b41096e258" exitCode=0 Mar 20 13:39:15 crc kubenswrapper[4856]: I0320 13:39:15.179687 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerDied","Data":"36f530c2c202cb39f9a87758069561be2b5b43cd565432b9d642c1b41096e258"} Mar 20 13:39:16 crc kubenswrapper[4856]: I0320 13:39:16.192918 4856 generic.go:334] "Generic (PLEG): container finished" podID="473deb71-de01-4290-a70d-f21794be8f0e" containerID="362b279382373d0b565b4f537aae428501b99068bd4615ad1a4643651358c41b" exitCode=0 Mar 20 13:39:16 crc kubenswrapper[4856]: I0320 13:39:16.193072 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerDied","Data":"362b279382373d0b565b4f537aae428501b99068bd4615ad1a4643651358c41b"} Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 
13:39:17.493241 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.601652 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt5t4\" (UniqueName: \"kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4\") pod \"473deb71-de01-4290-a70d-f21794be8f0e\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.601756 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util\") pod \"473deb71-de01-4290-a70d-f21794be8f0e\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.601821 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle\") pod \"473deb71-de01-4290-a70d-f21794be8f0e\" (UID: \"473deb71-de01-4290-a70d-f21794be8f0e\") " Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.603426 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle" (OuterVolumeSpecName: "bundle") pod "473deb71-de01-4290-a70d-f21794be8f0e" (UID: "473deb71-de01-4290-a70d-f21794be8f0e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.611489 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4" (OuterVolumeSpecName: "kube-api-access-bt5t4") pod "473deb71-de01-4290-a70d-f21794be8f0e" (UID: "473deb71-de01-4290-a70d-f21794be8f0e"). InnerVolumeSpecName "kube-api-access-bt5t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.626236 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util" (OuterVolumeSpecName: "util") pod "473deb71-de01-4290-a70d-f21794be8f0e" (UID: "473deb71-de01-4290-a70d-f21794be8f0e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.703912 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt5t4\" (UniqueName: \"kubernetes.io/projected/473deb71-de01-4290-a70d-f21794be8f0e-kube-api-access-bt5t4\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.703955 4856 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:17 crc kubenswrapper[4856]: I0320 13:39:17.703968 4856 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/473deb71-de01-4290-a70d-f21794be8f0e-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:18 crc kubenswrapper[4856]: I0320 13:39:18.209937 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" 
event={"ID":"473deb71-de01-4290-a70d-f21794be8f0e","Type":"ContainerDied","Data":"8335fcaaa9d592f1468fffd669a2bddde30887ad9f80eafad99de90f13b251d2"} Mar 20 13:39:18 crc kubenswrapper[4856]: I0320 13:39:18.210003 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8335fcaaa9d592f1468fffd669a2bddde30887ad9f80eafad99de90f13b251d2" Mar 20 13:39:18 crc kubenswrapper[4856]: I0320 13:39:18.210471 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024168 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j"] Mar 20 13:39:28 crc kubenswrapper[4856]: E0320 13:39:28.024766 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="pull" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024782 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="pull" Mar 20 13:39:28 crc kubenswrapper[4856]: E0320 13:39:28.024796 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="extract" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024804 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="extract" Mar 20 13:39:28 crc kubenswrapper[4856]: E0320 13:39:28.024822 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024831 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" Mar 20 13:39:28 crc kubenswrapper[4856]: E0320 13:39:28.024845 4856 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="util" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024853 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="util" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024973 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="473deb71-de01-4290-a70d-f21794be8f0e" containerName="extract" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.024993 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1d6d53-b4f3-4b21-bd32-b51edb57e5c4" containerName="console" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.026544 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.030144 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.030254 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-8sg47" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.030251 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.030618 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.031074 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.035905 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5xjpv\" (UniqueName: \"kubernetes.io/projected/c40497b5-5353-4d09-b108-88f673dc8f13-kube-api-access-5xjpv\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.035975 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-webhook-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.036003 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-apiservice-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.098565 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j"] Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.136822 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-webhook-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.136864 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-apiservice-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.136926 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xjpv\" (UniqueName: \"kubernetes.io/projected/c40497b5-5353-4d09-b108-88f673dc8f13-kube-api-access-5xjpv\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.146954 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-apiservice-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.147076 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c40497b5-5353-4d09-b108-88f673dc8f13-webhook-cert\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.174796 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xjpv\" (UniqueName: \"kubernetes.io/projected/c40497b5-5353-4d09-b108-88f673dc8f13-kube-api-access-5xjpv\") pod \"metallb-operator-controller-manager-5cbf85c554-nqz8j\" (UID: \"c40497b5-5353-4d09-b108-88f673dc8f13\") " 
pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.278216 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5"] Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.279065 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.281450 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.281718 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-v729h" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.282593 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.290730 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5"] Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.339860 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-webhook-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.339934 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/97ab1eaf-cfd7-47b3-afc8-c8327065108c-kube-api-access-rjhtm\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: 
\"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.339967 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-apiservice-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.344729 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.443708 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-webhook-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.443784 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/97ab1eaf-cfd7-47b3-afc8-c8327065108c-kube-api-access-rjhtm\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.443815 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-apiservice-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: 
\"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.449668 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-apiservice-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.451959 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97ab1eaf-cfd7-47b3-afc8-c8327065108c-webhook-cert\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.480169 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjhtm\" (UniqueName: \"kubernetes.io/projected/97ab1eaf-cfd7-47b3-afc8-c8327065108c-kube-api-access-rjhtm\") pod \"metallb-operator-webhook-server-7c5d84d5d5-qg7c5\" (UID: \"97ab1eaf-cfd7-47b3-afc8-c8327065108c\") " pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.567099 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j"] Mar 20 13:39:28 crc kubenswrapper[4856]: W0320 13:39:28.573531 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40497b5_5353_4d09_b108_88f673dc8f13.slice/crio-e77768b41d4e6add2c58065ed165853db5a7959b060640a93954842ecf498b0d WatchSource:0}: Error finding container 
e77768b41d4e6add2c58065ed165853db5a7959b060640a93954842ecf498b0d: Status 404 returned error can't find the container with id e77768b41d4e6add2c58065ed165853db5a7959b060640a93954842ecf498b0d Mar 20 13:39:28 crc kubenswrapper[4856]: I0320 13:39:28.595287 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:29 crc kubenswrapper[4856]: I0320 13:39:29.043699 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5"] Mar 20 13:39:29 crc kubenswrapper[4856]: W0320 13:39:29.050960 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97ab1eaf_cfd7_47b3_afc8_c8327065108c.slice/crio-94d0627f29bbdcdb0793b58e6fe318ce136b1f5c29118b8b6dd3b0e3509e37d4 WatchSource:0}: Error finding container 94d0627f29bbdcdb0793b58e6fe318ce136b1f5c29118b8b6dd3b0e3509e37d4: Status 404 returned error can't find the container with id 94d0627f29bbdcdb0793b58e6fe318ce136b1f5c29118b8b6dd3b0e3509e37d4 Mar 20 13:39:29 crc kubenswrapper[4856]: I0320 13:39:29.290381 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" event={"ID":"c40497b5-5353-4d09-b108-88f673dc8f13","Type":"ContainerStarted","Data":"e77768b41d4e6add2c58065ed165853db5a7959b060640a93954842ecf498b0d"} Mar 20 13:39:29 crc kubenswrapper[4856]: I0320 13:39:29.292339 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" event={"ID":"97ab1eaf-cfd7-47b3-afc8-c8327065108c","Type":"ContainerStarted","Data":"94d0627f29bbdcdb0793b58e6fe318ce136b1f5c29118b8b6dd3b0e3509e37d4"} Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.348179 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" event={"ID":"97ab1eaf-cfd7-47b3-afc8-c8327065108c","Type":"ContainerStarted","Data":"71ed226e77ce91e0a22d763ea79e37a39a6c665e9bfee32491412388120a070b"} Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.348764 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.350478 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" event={"ID":"c40497b5-5353-4d09-b108-88f673dc8f13","Type":"ContainerStarted","Data":"e0f262069c61699e87a78de951655d88b26f70bf9132e24311c88b9085399214"} Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.350619 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.366252 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" podStartSLOduration=1.6688336499999998 podStartE2EDuration="6.36623391s" podCreationTimestamp="2026-03-20 13:39:28 +0000 UTC" firstStartedPulling="2026-03-20 13:39:29.053465636 +0000 UTC m=+983.934491766" lastFinishedPulling="2026-03-20 13:39:33.750865856 +0000 UTC m=+988.631892026" observedRunningTime="2026-03-20 13:39:34.36434218 +0000 UTC m=+989.245368330" watchObservedRunningTime="2026-03-20 13:39:34.36623391 +0000 UTC m=+989.247260040" Mar 20 13:39:34 crc kubenswrapper[4856]: I0320 13:39:34.384648 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j" podStartSLOduration=1.218647172 podStartE2EDuration="6.384634183s" podCreationTimestamp="2026-03-20 13:39:28 +0000 UTC" firstStartedPulling="2026-03-20 
13:39:28.575526839 +0000 UTC m=+983.456552969" lastFinishedPulling="2026-03-20 13:39:33.74151381 +0000 UTC m=+988.622539980" observedRunningTime="2026-03-20 13:39:34.3826079 +0000 UTC m=+989.263634050" watchObservedRunningTime="2026-03-20 13:39:34.384634183 +0000 UTC m=+989.265660313" Mar 20 13:39:48 crc kubenswrapper[4856]: I0320 13:39:48.599998 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7c5d84d5d5-qg7c5" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.675721 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.678132 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.686245 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.765569 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml9kc\" (UniqueName: \"kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.765829 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.765987 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.867237 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.867330 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.867416 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml9kc\" (UniqueName: \"kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.867889 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.868210 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.889095 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml9kc\" (UniqueName: \"kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc\") pod \"certified-operators-6rj7q\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:51 crc kubenswrapper[4856]: I0320 13:39:51.997323 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:39:52 crc kubenswrapper[4856]: I0320 13:39:52.430086 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:39:52 crc kubenswrapper[4856]: I0320 13:39:52.456055 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerStarted","Data":"0de011f93d41b09f5ad8caf1640c55c9b8b7ee3221e9fb739a0e75039423d8e7"} Mar 20 13:39:53 crc kubenswrapper[4856]: I0320 13:39:53.463718 4856 generic.go:334] "Generic (PLEG): container finished" podID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerID="a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6" exitCode=0 Mar 20 13:39:53 crc kubenswrapper[4856]: I0320 13:39:53.463858 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerDied","Data":"a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6"} Mar 20 13:39:55 crc kubenswrapper[4856]: I0320 
13:39:55.483416 4856 generic.go:334] "Generic (PLEG): container finished" podID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerID="3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c" exitCode=0 Mar 20 13:39:55 crc kubenswrapper[4856]: I0320 13:39:55.483511 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerDied","Data":"3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c"} Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.888617 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"] Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.893643 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.900127 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"] Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.942015 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc4l2\" (UniqueName: \"kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.942110 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:56 crc kubenswrapper[4856]: I0320 13:39:56.942297 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.043478 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc4l2\" (UniqueName: \"kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.043570 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.043621 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.044109 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.044449 4856 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.070167 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc4l2\" (UniqueName: \"kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2\") pod \"redhat-marketplace-lzl6g\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") " pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.218833 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.441928 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"] Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.498534 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerStarted","Data":"8d3494a283631d16c82dcf1398f010e4063488f803633ab75474b367941aff76"} Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.503510 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerStarted","Data":"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e"} Mar 20 13:39:57 crc kubenswrapper[4856]: I0320 13:39:57.523093 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rj7q" podStartSLOduration=2.952419732 podStartE2EDuration="6.523079854s" podCreationTimestamp="2026-03-20 13:39:51 +0000 UTC" 
firstStartedPulling="2026-03-20 13:39:53.466838385 +0000 UTC m=+1008.347864515" lastFinishedPulling="2026-03-20 13:39:57.037498507 +0000 UTC m=+1011.918524637" observedRunningTime="2026-03-20 13:39:57.522074338 +0000 UTC m=+1012.403100548" watchObservedRunningTime="2026-03-20 13:39:57.523079854 +0000 UTC m=+1012.404105984" Mar 20 13:39:58 crc kubenswrapper[4856]: I0320 13:39:58.512237 4856 generic.go:334] "Generic (PLEG): container finished" podID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerID="357463eacd2c4e5f13045f3fee26166a450b0819176e2568471cf31e68a58020" exitCode=0 Mar 20 13:39:58 crc kubenswrapper[4856]: I0320 13:39:58.512297 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerDied","Data":"357463eacd2c4e5f13045f3fee26166a450b0819176e2568471cf31e68a58020"} Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.149836 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-7pp6m"] Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.150854 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.154186 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.159855 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.160301 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.165681 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-7pp6m"] Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.200365 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnn5b\" (UniqueName: \"kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b\") pod \"auto-csr-approver-29566900-7pp6m\" (UID: \"6e4380ff-fdce-457f-a1cf-0a5ed46754a0\") " pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.301976 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnn5b\" (UniqueName: \"kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b\") pod \"auto-csr-approver-29566900-7pp6m\" (UID: \"6e4380ff-fdce-457f-a1cf-0a5ed46754a0\") " pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.331058 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnn5b\" (UniqueName: \"kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b\") pod \"auto-csr-approver-29566900-7pp6m\" (UID: \"6e4380ff-fdce-457f-a1cf-0a5ed46754a0\") " 
pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.485255 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.528946 4856 generic.go:334] "Generic (PLEG): container finished" podID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerID="c03534640e2329a84ec3890ad41a2236b36338e5f6e852f1006406e4361e9894" exitCode=0 Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.529025 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerDied","Data":"c03534640e2329a84ec3890ad41a2236b36338e5f6e852f1006406e4361e9894"} Mar 20 13:40:00 crc kubenswrapper[4856]: I0320 13:40:00.967939 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-7pp6m"] Mar 20 13:40:01 crc kubenswrapper[4856]: I0320 13:40:01.540064 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" event={"ID":"6e4380ff-fdce-457f-a1cf-0a5ed46754a0","Type":"ContainerStarted","Data":"a0b25a82927c6c08ecdc98e2ceab80590fad7817c8ed8cfc9caf7f795c2885ce"} Mar 20 13:40:01 crc kubenswrapper[4856]: I0320 13:40:01.998229 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:01 crc kubenswrapper[4856]: I0320 13:40:01.998534 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:02 crc kubenswrapper[4856]: I0320 13:40:02.046145 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:02 crc kubenswrapper[4856]: I0320 13:40:02.547697 4856 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerStarted","Data":"da5cecda24f74ee322b6814a218802e345f531ef689169072cc0d6b603a9d49b"} Mar 20 13:40:02 crc kubenswrapper[4856]: I0320 13:40:02.568038 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lzl6g" podStartSLOduration=3.516893263 podStartE2EDuration="6.568020477s" podCreationTimestamp="2026-03-20 13:39:56 +0000 UTC" firstStartedPulling="2026-03-20 13:39:58.514383977 +0000 UTC m=+1013.395410107" lastFinishedPulling="2026-03-20 13:40:01.565511171 +0000 UTC m=+1016.446537321" observedRunningTime="2026-03-20 13:40:02.565971234 +0000 UTC m=+1017.446997384" watchObservedRunningTime="2026-03-20 13:40:02.568020477 +0000 UTC m=+1017.449046627" Mar 20 13:40:02 crc kubenswrapper[4856]: I0320 13:40:02.601874 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:03 crc kubenswrapper[4856]: I0320 13:40:03.475645 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:40:04 crc kubenswrapper[4856]: I0320 13:40:04.564839 4856 generic.go:334] "Generic (PLEG): container finished" podID="6e4380ff-fdce-457f-a1cf-0a5ed46754a0" containerID="bc3db4dd9b67dde3de860b22891ed04c026a2b4409012901bbf29b4c1ab5f56b" exitCode=0 Mar 20 13:40:04 crc kubenswrapper[4856]: I0320 13:40:04.564957 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" event={"ID":"6e4380ff-fdce-457f-a1cf-0a5ed46754a0","Type":"ContainerDied","Data":"bc3db4dd9b67dde3de860b22891ed04c026a2b4409012901bbf29b4c1ab5f56b"} Mar 20 13:40:04 crc kubenswrapper[4856]: I0320 13:40:04.565568 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rj7q" 
podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="registry-server" containerID="cri-o://4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e" gracePeriod=2 Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.027086 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.167049 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content\") pod \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.167115 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities\") pod \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.167157 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml9kc\" (UniqueName: \"kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc\") pod \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\" (UID: \"34b7a29e-b7ed-4f61-a4da-868c4e4053ad\") " Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.168120 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities" (OuterVolumeSpecName: "utilities") pod "34b7a29e-b7ed-4f61-a4da-868c4e4053ad" (UID: "34b7a29e-b7ed-4f61-a4da-868c4e4053ad"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.175554 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc" (OuterVolumeSpecName: "kube-api-access-ml9kc") pod "34b7a29e-b7ed-4f61-a4da-868c4e4053ad" (UID: "34b7a29e-b7ed-4f61-a4da-868c4e4053ad"). InnerVolumeSpecName "kube-api-access-ml9kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.268254 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.268320 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml9kc\" (UniqueName: \"kubernetes.io/projected/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-kube-api-access-ml9kc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.271477 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34b7a29e-b7ed-4f61-a4da-868c4e4053ad" (UID: "34b7a29e-b7ed-4f61-a4da-868c4e4053ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.369110 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b7a29e-b7ed-4f61-a4da-868c4e4053ad-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.577401 4856 generic.go:334] "Generic (PLEG): container finished" podID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerID="4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e" exitCode=0 Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.577491 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerDied","Data":"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e"} Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.577584 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rj7q" event={"ID":"34b7a29e-b7ed-4f61-a4da-868c4e4053ad","Type":"ContainerDied","Data":"0de011f93d41b09f5ad8caf1640c55c9b8b7ee3221e9fb739a0e75039423d8e7"} Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.577628 4856 scope.go:117] "RemoveContainer" containerID="4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.578537 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rj7q" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.611955 4856 scope.go:117] "RemoveContainer" containerID="3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.633658 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.638715 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rj7q"] Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.659223 4856 scope.go:117] "RemoveContainer" containerID="a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.694727 4856 scope.go:117] "RemoveContainer" containerID="4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e" Mar 20 13:40:05 crc kubenswrapper[4856]: E0320 13:40:05.695195 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e\": container with ID starting with 4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e not found: ID does not exist" containerID="4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.695254 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e"} err="failed to get container status \"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e\": rpc error: code = NotFound desc = could not find container \"4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e\": container with ID starting with 4ea54a4294826e68c6a6dfac6ffc4fe7dcdb2d8df29592f4782ef10946541b4e not 
found: ID does not exist" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.695315 4856 scope.go:117] "RemoveContainer" containerID="3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c" Mar 20 13:40:05 crc kubenswrapper[4856]: E0320 13:40:05.695607 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c\": container with ID starting with 3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c not found: ID does not exist" containerID="3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.695661 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c"} err="failed to get container status \"3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c\": rpc error: code = NotFound desc = could not find container \"3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c\": container with ID starting with 3fd24acee8c79b1b6b556cab83bccc7124ffd17cc4e3009063977ff83c9b391c not found: ID does not exist" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.695685 4856 scope.go:117] "RemoveContainer" containerID="a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6" Mar 20 13:40:05 crc kubenswrapper[4856]: E0320 13:40:05.695965 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6\": container with ID starting with a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6 not found: ID does not exist" containerID="a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.696022 4856 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6"} err="failed to get container status \"a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6\": rpc error: code = NotFound desc = could not find container \"a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6\": container with ID starting with a6b7009eb9e2c6ce438564711178aca2bacb5f69aa75cad0b1f59801e3f226f6 not found: ID does not exist" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.812892 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.833204 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" path="/var/lib/kubelet/pods/34b7a29e-b7ed-4f61-a4da-868c4e4053ad/volumes" Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.975070 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnn5b\" (UniqueName: \"kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b\") pod \"6e4380ff-fdce-457f-a1cf-0a5ed46754a0\" (UID: \"6e4380ff-fdce-457f-a1cf-0a5ed46754a0\") " Mar 20 13:40:05 crc kubenswrapper[4856]: I0320 13:40:05.979223 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b" (OuterVolumeSpecName: "kube-api-access-nnn5b") pod "6e4380ff-fdce-457f-a1cf-0a5ed46754a0" (UID: "6e4380ff-fdce-457f-a1cf-0a5ed46754a0"). InnerVolumeSpecName "kube-api-access-nnn5b". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.077713 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnn5b\" (UniqueName: \"kubernetes.io/projected/6e4380ff-fdce-457f-a1cf-0a5ed46754a0-kube-api-access-nnn5b\") on node \"crc\" DevicePath \"\""
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.587753 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-7pp6m" event={"ID":"6e4380ff-fdce-457f-a1cf-0a5ed46754a0","Type":"ContainerDied","Data":"a0b25a82927c6c08ecdc98e2ceab80590fad7817c8ed8cfc9caf7f795c2885ce"}
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.588472 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0b25a82927c6c08ecdc98e2ceab80590fad7817c8ed8cfc9caf7f795c2885ce"
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.587823 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-7pp6m"
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.888465 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-28tg6"]
Mar 20 13:40:06 crc kubenswrapper[4856]: I0320 13:40:06.895130 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-28tg6"]
Mar 20 13:40:07 crc kubenswrapper[4856]: I0320 13:40:07.218971 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lzl6g"
Mar 20 13:40:07 crc kubenswrapper[4856]: I0320 13:40:07.219094 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lzl6g"
Mar 20 13:40:07 crc kubenswrapper[4856]: I0320 13:40:07.278568 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lzl6g"
Mar 20 13:40:07 crc kubenswrapper[4856]: I0320 13:40:07.644523 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lzl6g"
Mar 20 13:40:07 crc kubenswrapper[4856]: I0320 13:40:07.827363 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bd55705-a77e-4bc9-941b-eaf18f2fc458" path="/var/lib/kubelet/pods/6bd55705-a77e-4bc9-941b-eaf18f2fc458/volumes"
Mar 20 13:40:08 crc kubenswrapper[4856]: I0320 13:40:08.348133 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5cbf85c554-nqz8j"
Mar 20 13:40:08 crc kubenswrapper[4856]: I0320 13:40:08.870442 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.020673 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-stt59"]
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.020939 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="registry-server"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.020958 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="registry-server"
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.020976 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="extract-content"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.020983 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="extract-content"
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.020997 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="extract-utilities"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.021005 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="extract-utilities"
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.021018 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e4380ff-fdce-457f-a1cf-0a5ed46754a0" containerName="oc"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.021025 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e4380ff-fdce-457f-a1cf-0a5ed46754a0" containerName="oc"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.021164 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e4380ff-fdce-457f-a1cf-0a5ed46754a0" containerName="oc"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.021182 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b7a29e-b7ed-4f61-a4da-868c4e4053ad" containerName="registry-server"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.023495 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.025685 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-q9p8r"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.026468 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.027008 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.039109 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.040356 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.041915 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.053775 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.177167 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-gjqm4"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.178249 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.180113 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.180582 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.180835 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.180998 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-bdtc7"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.183135 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-hc6d5"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.184225 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.187937 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.203614 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hc6d5"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.216359 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-reloader\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.216415 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d087c80-beb6-4bdd-b00a-248658c0378c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217054 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-metrics\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217102 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-conf\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217174 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-sockets\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217408 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75f4513d-63d1-4433-b319-038b189e4be5-frr-startup\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217528 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mctnq\" (UniqueName: \"kubernetes.io/projected/75f4513d-63d1-4433-b319-038b189e4be5-kube-api-access-mctnq\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217574 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75f4513d-63d1-4433-b319-038b189e4be5-metrics-certs\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.217614 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txphn\" (UniqueName: \"kubernetes.io/projected/7d087c80-beb6-4bdd-b00a-248658c0378c-kube-api-access-txphn\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318237 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-cert\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318352 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75f4513d-63d1-4433-b319-038b189e4be5-frr-startup\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318385 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-metrics-certs\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318413 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrzgr\" (UniqueName: \"kubernetes.io/projected/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-kube-api-access-rrzgr\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318434 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mctnq\" (UniqueName: \"kubernetes.io/projected/75f4513d-63d1-4433-b319-038b189e4be5-kube-api-access-mctnq\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318515 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metrics-certs\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318587 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75f4513d-63d1-4433-b319-038b189e4be5-metrics-certs\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318614 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txphn\" (UniqueName: \"kubernetes.io/projected/7d087c80-beb6-4bdd-b00a-248658c0378c-kube-api-access-txphn\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318665 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8sdj\" (UniqueName: \"kubernetes.io/projected/c947915a-318f-480b-b4cb-96024bb62eb3-kube-api-access-w8sdj\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318693 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-reloader\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318712 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d087c80-beb6-4bdd-b00a-248658c0378c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318808 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-metrics\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318870 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-conf\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318934 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metallb-excludel2\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.318995 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-sockets\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319172 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-metrics\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319169 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-reloader\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319370 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-conf\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319412 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75f4513d-63d1-4433-b319-038b189e4be5-frr-sockets\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319115 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.319539 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75f4513d-63d1-4433-b319-038b189e4be5-frr-startup\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.324921 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75f4513d-63d1-4433-b319-038b189e4be5-metrics-certs\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.333835 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7d087c80-beb6-4bdd-b00a-248658c0378c-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.335640 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mctnq\" (UniqueName: \"kubernetes.io/projected/75f4513d-63d1-4433-b319-038b189e4be5-kube-api-access-mctnq\") pod \"frr-k8s-stt59\" (UID: \"75f4513d-63d1-4433-b319-038b189e4be5\") " pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.336475 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txphn\" (UniqueName: \"kubernetes.io/projected/7d087c80-beb6-4bdd-b00a-248658c0378c-kube-api-access-txphn\") pod \"frr-k8s-webhook-server-bcc4b6f68-s4f2d\" (UID: \"7d087c80-beb6-4bdd-b00a-248658c0378c\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.340809 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-stt59"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.354929 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420118 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8sdj\" (UniqueName: \"kubernetes.io/projected/c947915a-318f-480b-b4cb-96024bb62eb3-kube-api-access-w8sdj\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420195 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metallb-excludel2\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420217 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420233 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-cert\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420284 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-metrics-certs\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420313 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrzgr\" (UniqueName: \"kubernetes.io/projected/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-kube-api-access-rrzgr\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420329 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metrics-certs\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.420696 4856 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 13:40:09 crc kubenswrapper[4856]: E0320 13:40:09.420763 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist podName:c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e nodeName:}" failed. No retries permitted until 2026-03-20 13:40:09.920745598 +0000 UTC m=+1024.801771718 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist") pod "speaker-gjqm4" (UID: "c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e") : secret "metallb-memberlist" not found
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.420962 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metallb-excludel2\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.423477 4856 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.425344 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-metrics-certs\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.426487 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-metrics-certs\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.433923 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c947915a-318f-480b-b4cb-96024bb62eb3-cert\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.445402 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8sdj\" (UniqueName: \"kubernetes.io/projected/c947915a-318f-480b-b4cb-96024bb62eb3-kube-api-access-w8sdj\") pod \"controller-7bb4cc7c98-hc6d5\" (UID: \"c947915a-318f-480b-b4cb-96024bb62eb3\") " pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.449442 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrzgr\" (UniqueName: \"kubernetes.io/projected/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-kube-api-access-rrzgr\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.499919 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.563291 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d"]
Mar 20 13:40:09 crc kubenswrapper[4856]: W0320 13:40:09.566597 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d087c80_beb6_4bdd_b00a_248658c0378c.slice/crio-61cf2c98346bb4ecc206f0fde1910773cc13f9a629f89de4061c6b1e808e2c08 WatchSource:0}: Error finding container 61cf2c98346bb4ecc206f0fde1910773cc13f9a629f89de4061c6b1e808e2c08: Status 404 returned error can't find the container with id 61cf2c98346bb4ecc206f0fde1910773cc13f9a629f89de4061c6b1e808e2c08
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.608039 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d" event={"ID":"7d087c80-beb6-4bdd-b00a-248658c0378c","Type":"ContainerStarted","Data":"61cf2c98346bb4ecc206f0fde1910773cc13f9a629f89de4061c6b1e808e2c08"}
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.608171 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lzl6g" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="registry-server" containerID="cri-o://da5cecda24f74ee322b6814a218802e345f531ef689169072cc0d6b603a9d49b" gracePeriod=2
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.705914 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hc6d5"]
Mar 20 13:40:09 crc kubenswrapper[4856]: W0320 13:40:09.712728 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc947915a_318f_480b_b4cb_96024bb62eb3.slice/crio-08020d5100bd4c55d4ae5679b77f09edd628f19a69c54ef55a6f90a2b707d475 WatchSource:0}: Error finding container 08020d5100bd4c55d4ae5679b77f09edd628f19a69c54ef55a6f90a2b707d475: Status 404 returned error can't find the container with id 08020d5100bd4c55d4ae5679b77f09edd628f19a69c54ef55a6f90a2b707d475
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.888968 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dknq9"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.917068 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dknq9"]
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.930413 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.931627 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:09 crc kubenswrapper[4856]: I0320 13:40:09.940775 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e-memberlist\") pod \"speaker-gjqm4\" (UID: \"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e\") " pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.033337 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqvx\" (UniqueName: \"kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.033401 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.033598 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.093200 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-gjqm4"
Mar 20 13:40:10 crc kubenswrapper[4856]: W0320 13:40:10.107606 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0c6db6f_8ff3_4d0e_9172_5d9a7ed83a8e.slice/crio-b006cc19f18de594bcb1fe52b5b93f259d59cf3e338d6e5d1eda68a23810082a WatchSource:0}: Error finding container b006cc19f18de594bcb1fe52b5b93f259d59cf3e338d6e5d1eda68a23810082a: Status 404 returned error can't find the container with id b006cc19f18de594bcb1fe52b5b93f259d59cf3e338d6e5d1eda68a23810082a
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.134814 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqvx\" (UniqueName: \"kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.134863 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.134941 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.135452 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.135479 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.153576 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqvx\" (UniqueName: \"kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx\") pod \"community-operators-dknq9\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.260104 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dknq9"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.617903 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"66c58a684f44d6e2962cb69a8f46c815023a0eea18da98c57a281df4b8d156f7"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.624152 4856 generic.go:334] "Generic (PLEG): container finished" podID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerID="da5cecda24f74ee322b6814a218802e345f531ef689169072cc0d6b603a9d49b" exitCode=0
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.624201 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerDied","Data":"da5cecda24f74ee322b6814a218802e345f531ef689169072cc0d6b603a9d49b"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.624223 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lzl6g" event={"ID":"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942","Type":"ContainerDied","Data":"8d3494a283631d16c82dcf1398f010e4063488f803633ab75474b367941aff76"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.624233 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d3494a283631d16c82dcf1398f010e4063488f803633ab75474b367941aff76"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.626319 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjqm4" event={"ID":"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e","Type":"ContainerStarted","Data":"1b91d19deb091492e8821332bb5508b7986aae4080a32b826c281b7312b374f9"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.626342 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjqm4" event={"ID":"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e","Type":"ContainerStarted","Data":"b006cc19f18de594bcb1fe52b5b93f259d59cf3e338d6e5d1eda68a23810082a"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.628421 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hc6d5" event={"ID":"c947915a-318f-480b-b4cb-96024bb62eb3","Type":"ContainerStarted","Data":"e6216b40cc2d4c31122bdd85951be1bced8f92f96d1546c0fd47677a3ecb0020"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.628494 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hc6d5" event={"ID":"c947915a-318f-480b-b4cb-96024bb62eb3","Type":"ContainerStarted","Data":"ba4f1e9c374b264ac991b8724ae87c58b2f0ba9554e4c117814505ce874c4d42"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.628504 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hc6d5" event={"ID":"c947915a-318f-480b-b4cb-96024bb62eb3","Type":"ContainerStarted","Data":"08020d5100bd4c55d4ae5679b77f09edd628f19a69c54ef55a6f90a2b707d475"}
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.629308 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-hc6d5"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.639060 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzl6g"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.644932 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-hc6d5" podStartSLOduration=1.644919343 podStartE2EDuration="1.644919343s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:10.643971388 +0000 UTC m=+1025.524997518" watchObservedRunningTime="2026-03-20 13:40:10.644919343 +0000 UTC m=+1025.525945473"
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.742476 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content\") pod \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") "
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.744489 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc4l2\" (UniqueName: \"kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2\") pod \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") "
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.744586 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities\") pod \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\" (UID: \"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942\") "
Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.745319 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities" (OuterVolumeSpecName: "utilities") pod
"832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" (UID: "832cc8e4-5aae-4e5c-bba0-43f0d3eb5942"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.750822 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2" (OuterVolumeSpecName: "kube-api-access-rc4l2") pod "832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" (UID: "832cc8e4-5aae-4e5c-bba0-43f0d3eb5942"). InnerVolumeSpecName "kube-api-access-rc4l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.770127 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" (UID: "832cc8e4-5aae-4e5c-bba0-43f0d3eb5942"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.845907 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.845946 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc4l2\" (UniqueName: \"kubernetes.io/projected/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-kube-api-access-rc4l2\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.845960 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:10 crc kubenswrapper[4856]: I0320 13:40:10.893988 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dknq9"] Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.641244 4856 generic.go:334] "Generic (PLEG): container finished" podID="c7575129-b339-490d-b407-9750a357c413" containerID="f39c1d899f0f038b8db6fc3d2a7ba1ef9fb9e24b504f51d1a4eb64bc784a8a64" exitCode=0 Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.641356 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerDied","Data":"f39c1d899f0f038b8db6fc3d2a7ba1ef9fb9e24b504f51d1a4eb64bc784a8a64"} Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.641584 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerStarted","Data":"1a7b83a9e637b3f4e6bcc9afd2467c06344fcd5c6536a0fb7dc8ba599af4933c"} Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 
13:40:11.646364 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-gjqm4" event={"ID":"c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e","Type":"ContainerStarted","Data":"28782bcd777aac6a3d825d0221e1501aa8465fcd50cd73ded756f63f13a6541d"} Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.646432 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-gjqm4" Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.646516 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lzl6g" Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.685730 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-gjqm4" podStartSLOduration=2.685712954 podStartE2EDuration="2.685712954s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:11.681694299 +0000 UTC m=+1026.562720449" watchObservedRunningTime="2026-03-20 13:40:11.685712954 +0000 UTC m=+1026.566739084" Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.707134 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"] Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.712571 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lzl6g"] Mar 20 13:40:11 crc kubenswrapper[4856]: I0320 13:40:11.829487 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" path="/var/lib/kubelet/pods/832cc8e4-5aae-4e5c-bba0-43f0d3eb5942/volumes" Mar 20 13:40:13 crc kubenswrapper[4856]: I0320 13:40:13.662537 4856 generic.go:334] "Generic (PLEG): container finished" podID="c7575129-b339-490d-b407-9750a357c413" 
containerID="6ca2f3b3a245a2e4e177dc4081fb7c9006024e8dca5ce4acb5c608deade226b8" exitCode=0 Mar 20 13:40:13 crc kubenswrapper[4856]: I0320 13:40:13.662741 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerDied","Data":"6ca2f3b3a245a2e4e177dc4081fb7c9006024e8dca5ce4acb5c608deade226b8"} Mar 20 13:40:19 crc kubenswrapper[4856]: I0320 13:40:19.505330 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-hc6d5" Mar 20 13:40:19 crc kubenswrapper[4856]: I0320 13:40:19.713980 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerStarted","Data":"d6397215b08c17f492c4f7ee0fd196b3dc25db28ffc1ffe1846dda25c7d1c513"} Mar 20 13:40:19 crc kubenswrapper[4856]: I0320 13:40:19.745041 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-dknq9" podStartSLOduration=3.68857694 podStartE2EDuration="10.745026547s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="2026-03-20 13:40:11.642722226 +0000 UTC m=+1026.523748356" lastFinishedPulling="2026-03-20 13:40:18.699171783 +0000 UTC m=+1033.580197963" observedRunningTime="2026-03-20 13:40:19.744293308 +0000 UTC m=+1034.625319458" watchObservedRunningTime="2026-03-20 13:40:19.745026547 +0000 UTC m=+1034.626052677" Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.097475 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-gjqm4" Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.261118 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.261196 4856 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.722652 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d" event={"ID":"7d087c80-beb6-4bdd-b00a-248658c0378c","Type":"ContainerStarted","Data":"7d0ed1c1f10769936b7fabde0168a10ef12f999ffa64147c6341bf1017c99987"} Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.722762 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d" Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.724214 4856 generic.go:334] "Generic (PLEG): container finished" podID="75f4513d-63d1-4433-b319-038b189e4be5" containerID="573221d307ff05ef9fb6ddc7602b1625534ef9595d56443b9fe31d1cb7e92eff" exitCode=0 Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.724311 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerDied","Data":"573221d307ff05ef9fb6ddc7602b1625534ef9595d56443b9fe31d1cb7e92eff"} Mar 20 13:40:20 crc kubenswrapper[4856]: I0320 13:40:20.747941 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d" podStartSLOduration=1.711522845 podStartE2EDuration="11.747769533s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="2026-03-20 13:40:09.569539694 +0000 UTC m=+1024.450565834" lastFinishedPulling="2026-03-20 13:40:19.605786372 +0000 UTC m=+1034.486812522" observedRunningTime="2026-03-20 13:40:20.74308004 +0000 UTC m=+1035.624106180" watchObservedRunningTime="2026-03-20 13:40:20.747769533 +0000 UTC m=+1035.628795673" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.312392 4856 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/community-operators-dknq9" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="registry-server" probeResult="failure" output=< Mar 20 13:40:21 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:40:21 crc kubenswrapper[4856]: > Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.350876 4856 scope.go:117] "RemoveContainer" containerID="7c8070ad6b67ff7d3e223a0ed884b7654c48831c61a1d12083ab32c3121e7601" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.473281 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b"] Mar 20 13:40:21 crc kubenswrapper[4856]: E0320 13:40:21.473481 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="extract-utilities" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.473493 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="extract-utilities" Mar 20 13:40:21 crc kubenswrapper[4856]: E0320 13:40:21.473508 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="extract-content" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.473514 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="extract-content" Mar 20 13:40:21 crc kubenswrapper[4856]: E0320 13:40:21.473541 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="registry-server" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.473548 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="registry-server" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.473650 4856 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="832cc8e4-5aae-4e5c-bba0-43f0d3eb5942" containerName="registry-server" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.474379 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.477580 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.487712 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b"] Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.545410 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.545475 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.545546 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w8xf\" (UniqueName: \"kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.663792 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w8xf\" (UniqueName: \"kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.663902 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.663945 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.664709 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.664978 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.688576 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w8xf\" (UniqueName: \"kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.731969 4856 generic.go:334] "Generic (PLEG): container finished" podID="75f4513d-63d1-4433-b319-038b189e4be5" containerID="dad8478f5c9e8b164086e8654105d9b30401f5440c72fad0d45bbfd9221d5bc2" exitCode=0 Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.732099 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerDied","Data":"dad8478f5c9e8b164086e8654105d9b30401f5440c72fad0d45bbfd9221d5bc2"} Mar 20 13:40:21 crc kubenswrapper[4856]: I0320 13:40:21.789933 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:22 crc kubenswrapper[4856]: I0320 13:40:22.233963 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b"] Mar 20 13:40:22 crc kubenswrapper[4856]: W0320 13:40:22.245228 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0756beee_1cb6_48fe_8910_fb87333a83d8.slice/crio-49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315 WatchSource:0}: Error finding container 49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315: Status 404 returned error can't find the container with id 49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315 Mar 20 13:40:22 crc kubenswrapper[4856]: I0320 13:40:22.740950 4856 generic.go:334] "Generic (PLEG): container finished" podID="75f4513d-63d1-4433-b319-038b189e4be5" containerID="2a72fc94fbca63ccbc4538f5dd165adfec052cf61c677533dddf1d593fd86ad8" exitCode=0 Mar 20 13:40:22 crc kubenswrapper[4856]: I0320 13:40:22.741037 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerDied","Data":"2a72fc94fbca63ccbc4538f5dd165adfec052cf61c677533dddf1d593fd86ad8"} Mar 20 13:40:22 crc kubenswrapper[4856]: I0320 13:40:22.747702 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerStarted","Data":"ccb407143f619768b1c3acb84f2b8ac606bacc313f2a6f840f3eacfc325fddf4"} Mar 20 13:40:22 crc kubenswrapper[4856]: I0320 13:40:22.747751 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" 
event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerStarted","Data":"49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315"} Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.757385 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"afaf6c50ac554d3969eafded2ea7088133f8bd249239eb121960388c6d2b0fab"} Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.757665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"0ca1c3bf3816cda5df2173a1a1d1daf44223e4ed8d88ff0326ad3faa34a1dcaf"} Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.757679 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"c1e4dbf8909eae6db8642af608681ac27799d2c9c547207bf535cb197eddf721"} Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.757690 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"e96f974842d01f9b3400fc72c5383fab84257e3e6a05ba758b254f07db295615"} Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.759565 4856 generic.go:334] "Generic (PLEG): container finished" podID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerID="ccb407143f619768b1c3acb84f2b8ac606bacc313f2a6f840f3eacfc325fddf4" exitCode=0 Mar 20 13:40:23 crc kubenswrapper[4856]: I0320 13:40:23.759607 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerDied","Data":"ccb407143f619768b1c3acb84f2b8ac606bacc313f2a6f840f3eacfc325fddf4"} Mar 20 
13:40:24 crc kubenswrapper[4856]: I0320 13:40:24.769387 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"956a0b7afb390f957499acbba5eee2cbaa153ecb105a120f418b0c9e5f13db29"} Mar 20 13:40:25 crc kubenswrapper[4856]: I0320 13:40:25.794231 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-stt59" event={"ID":"75f4513d-63d1-4433-b319-038b189e4be5","Type":"ContainerStarted","Data":"506fb5d168dff5be0b39b505ad72caef89a9fd6b1c570bfc2a9c71172145da32"} Mar 20 13:40:25 crc kubenswrapper[4856]: I0320 13:40:25.794634 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-stt59" Mar 20 13:40:25 crc kubenswrapper[4856]: I0320 13:40:25.818749 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-stt59" podStartSLOduration=6.989944948 podStartE2EDuration="16.818731991s" podCreationTimestamp="2026-03-20 13:40:09 +0000 UTC" firstStartedPulling="2026-03-20 13:40:09.800908627 +0000 UTC m=+1024.681934757" lastFinishedPulling="2026-03-20 13:40:19.62969565 +0000 UTC m=+1034.510721800" observedRunningTime="2026-03-20 13:40:25.818204268 +0000 UTC m=+1040.699230398" watchObservedRunningTime="2026-03-20 13:40:25.818731991 +0000 UTC m=+1040.699758121" Mar 20 13:40:26 crc kubenswrapper[4856]: I0320 13:40:26.806825 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerStarted","Data":"3eb871130fd57df504141c771366ee03faf067c904464baecd28b94a16a04a39"} Mar 20 13:40:27 crc kubenswrapper[4856]: I0320 13:40:27.815705 4856 generic.go:334] "Generic (PLEG): container finished" podID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerID="3eb871130fd57df504141c771366ee03faf067c904464baecd28b94a16a04a39" 
exitCode=0 Mar 20 13:40:27 crc kubenswrapper[4856]: I0320 13:40:27.815824 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerDied","Data":"3eb871130fd57df504141c771366ee03faf067c904464baecd28b94a16a04a39"} Mar 20 13:40:28 crc kubenswrapper[4856]: I0320 13:40:28.845376 4856 generic.go:334] "Generic (PLEG): container finished" podID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerID="8f03ec1555f4b621777e3285495101b06e8c0c2de0fca9fb2f34dcb71edde83c" exitCode=0 Mar 20 13:40:28 crc kubenswrapper[4856]: I0320 13:40:28.845458 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerDied","Data":"8f03ec1555f4b621777e3285495101b06e8c0c2de0fca9fb2f34dcb71edde83c"} Mar 20 13:40:29 crc kubenswrapper[4856]: I0320 13:40:29.342199 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-stt59" Mar 20 13:40:29 crc kubenswrapper[4856]: I0320 13:40:29.362122 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-s4f2d" Mar 20 13:40:29 crc kubenswrapper[4856]: I0320 13:40:29.405916 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-stt59" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.138819 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.187492 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle\") pod \"0756beee-1cb6-48fe-8910-fb87333a83d8\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.187656 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w8xf\" (UniqueName: \"kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf\") pod \"0756beee-1cb6-48fe-8910-fb87333a83d8\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.187697 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util\") pod \"0756beee-1cb6-48fe-8910-fb87333a83d8\" (UID: \"0756beee-1cb6-48fe-8910-fb87333a83d8\") " Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.189304 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle" (OuterVolumeSpecName: "bundle") pod "0756beee-1cb6-48fe-8910-fb87333a83d8" (UID: "0756beee-1cb6-48fe-8910-fb87333a83d8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.194342 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf" (OuterVolumeSpecName: "kube-api-access-2w8xf") pod "0756beee-1cb6-48fe-8910-fb87333a83d8" (UID: "0756beee-1cb6-48fe-8910-fb87333a83d8"). InnerVolumeSpecName "kube-api-access-2w8xf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.197768 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util" (OuterVolumeSpecName: "util") pod "0756beee-1cb6-48fe-8910-fb87333a83d8" (UID: "0756beee-1cb6-48fe-8910-fb87333a83d8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.289378 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w8xf\" (UniqueName: \"kubernetes.io/projected/0756beee-1cb6-48fe-8910-fb87333a83d8-kube-api-access-2w8xf\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.289630 4856 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.289639 4856 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0756beee-1cb6-48fe-8910-fb87333a83d8-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.305343 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.361201 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.861524 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.861523 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b" event={"ID":"0756beee-1cb6-48fe-8910-fb87333a83d8","Type":"ContainerDied","Data":"49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315"} Mar 20 13:40:30 crc kubenswrapper[4856]: I0320 13:40:30.861989 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49243b1fb3c6608a9fe0dfe773181defd067073dc3d8a141f66f6fbee66f3315" Mar 20 13:40:32 crc kubenswrapper[4856]: I0320 13:40:32.793337 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dknq9"] Mar 20 13:40:32 crc kubenswrapper[4856]: I0320 13:40:32.793562 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-dknq9" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="registry-server" containerID="cri-o://d6397215b08c17f492c4f7ee0fd196b3dc25db28ffc1ffe1846dda25c7d1c513" gracePeriod=2 Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.886426 4856 generic.go:334] "Generic (PLEG): container finished" podID="c7575129-b339-490d-b407-9750a357c413" containerID="d6397215b08c17f492c4f7ee0fd196b3dc25db28ffc1ffe1846dda25c7d1c513" exitCode=0 Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.886498 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerDied","Data":"d6397215b08c17f492c4f7ee0fd196b3dc25db28ffc1ffe1846dda25c7d1c513"} Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.937765 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7"] Mar 20 13:40:33 crc kubenswrapper[4856]: E0320 13:40:33.938204 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="pull" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.938217 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="pull" Mar 20 13:40:33 crc kubenswrapper[4856]: E0320 13:40:33.938227 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="extract" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.938233 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="extract" Mar 20 13:40:33 crc kubenswrapper[4856]: E0320 13:40:33.938254 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="util" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.938259 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="util" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.938373 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0756beee-1cb6-48fe-8910-fb87333a83d8" containerName="extract" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.938752 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.942243 4856 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-b6px8" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.942484 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.942646 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.954498 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7"] Mar 20 13:40:33 crc kubenswrapper[4856]: I0320 13:40:33.996983 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.037393 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdqvx\" (UniqueName: \"kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx\") pod \"c7575129-b339-490d-b407-9750a357c413\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.037500 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content\") pod \"c7575129-b339-490d-b407-9750a357c413\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.037631 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities\") pod \"c7575129-b339-490d-b407-9750a357c413\" (UID: \"c7575129-b339-490d-b407-9750a357c413\") " Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.037848 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgr2s\" (UniqueName: \"kubernetes.io/projected/b2a35b97-2954-49c2-a3be-8e00544517a4-kube-api-access-mgr2s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: \"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.037896 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2a35b97-2954-49c2-a3be-8e00544517a4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: 
\"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.044304 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities" (OuterVolumeSpecName: "utilities") pod "c7575129-b339-490d-b407-9750a357c413" (UID: "c7575129-b339-490d-b407-9750a357c413"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.046412 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx" (OuterVolumeSpecName: "kube-api-access-jdqvx") pod "c7575129-b339-490d-b407-9750a357c413" (UID: "c7575129-b339-490d-b407-9750a357c413"). InnerVolumeSpecName "kube-api-access-jdqvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.087621 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7575129-b339-490d-b407-9750a357c413" (UID: "c7575129-b339-490d-b407-9750a357c413"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139228 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgr2s\" (UniqueName: \"kubernetes.io/projected/b2a35b97-2954-49c2-a3be-8e00544517a4-kube-api-access-mgr2s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: \"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139291 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b2a35b97-2954-49c2-a3be-8e00544517a4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: \"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139396 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139408 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7575129-b339-490d-b407-9750a357c413-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139417 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdqvx\" (UniqueName: \"kubernetes.io/projected/c7575129-b339-490d-b407-9750a357c413-kube-api-access-jdqvx\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.139796 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/b2a35b97-2954-49c2-a3be-8e00544517a4-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: \"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.159343 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgr2s\" (UniqueName: \"kubernetes.io/projected/b2a35b97-2954-49c2-a3be-8e00544517a4-kube-api-access-mgr2s\") pod \"cert-manager-operator-controller-manager-66c8bdd694-tq9d7\" (UID: \"b2a35b97-2954-49c2-a3be-8e00544517a4\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.308741 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.718436 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7"] Mar 20 13:40:34 crc kubenswrapper[4856]: W0320 13:40:34.734058 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a35b97_2954_49c2_a3be_8e00544517a4.slice/crio-e383c88897cfdd4251e75a5dd25bfdd777ac4215cdda79a81528bb91ba43d89f WatchSource:0}: Error finding container e383c88897cfdd4251e75a5dd25bfdd777ac4215cdda79a81528bb91ba43d89f: Status 404 returned error can't find the container with id e383c88897cfdd4251e75a5dd25bfdd777ac4215cdda79a81528bb91ba43d89f Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.895196 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dknq9" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.895187 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dknq9" event={"ID":"c7575129-b339-490d-b407-9750a357c413","Type":"ContainerDied","Data":"1a7b83a9e637b3f4e6bcc9afd2467c06344fcd5c6536a0fb7dc8ba599af4933c"} Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.895387 4856 scope.go:117] "RemoveContainer" containerID="d6397215b08c17f492c4f7ee0fd196b3dc25db28ffc1ffe1846dda25c7d1c513" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.896606 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" event={"ID":"b2a35b97-2954-49c2-a3be-8e00544517a4","Type":"ContainerStarted","Data":"e383c88897cfdd4251e75a5dd25bfdd777ac4215cdda79a81528bb91ba43d89f"} Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.915501 4856 scope.go:117] "RemoveContainer" containerID="6ca2f3b3a245a2e4e177dc4081fb7c9006024e8dca5ce4acb5c608deade226b8" Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.936685 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dknq9"] Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.946091 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dknq9"] Mar 20 13:40:34 crc kubenswrapper[4856]: I0320 13:40:34.948175 4856 scope.go:117] "RemoveContainer" containerID="f39c1d899f0f038b8db6fc3d2a7ba1ef9fb9e24b504f51d1a4eb64bc784a8a64" Mar 20 13:40:35 crc kubenswrapper[4856]: I0320 13:40:35.838533 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7575129-b339-490d-b407-9750a357c413" path="/var/lib/kubelet/pods/c7575129-b339-490d-b407-9750a357c413/volumes" Mar 20 13:40:37 crc kubenswrapper[4856]: I0320 13:40:37.925089 4856 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" event={"ID":"b2a35b97-2954-49c2-a3be-8e00544517a4","Type":"ContainerStarted","Data":"4b49c58bc12c8a0220a10d07d970f348825a13dbb784e5efa44536633aa52f57"} Mar 20 13:40:37 crc kubenswrapper[4856]: I0320 13:40:37.953514 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-tq9d7" podStartSLOduration=2.137275686 podStartE2EDuration="4.953494098s" podCreationTimestamp="2026-03-20 13:40:33 +0000 UTC" firstStartedPulling="2026-03-20 13:40:34.73722635 +0000 UTC m=+1049.618252480" lastFinishedPulling="2026-03-20 13:40:37.553444762 +0000 UTC m=+1052.434470892" observedRunningTime="2026-03-20 13:40:37.947445449 +0000 UTC m=+1052.828471599" watchObservedRunningTime="2026-03-20 13:40:37.953494098 +0000 UTC m=+1052.834520238" Mar 20 13:40:39 crc kubenswrapper[4856]: I0320 13:40:39.350461 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-stt59" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.889136 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bs4xl"] Mar 20 13:40:40 crc kubenswrapper[4856]: E0320 13:40:40.889768 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="extract-content" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.889787 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="extract-content" Mar 20 13:40:40 crc kubenswrapper[4856]: E0320 13:40:40.889806 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="extract-utilities" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.889816 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c7575129-b339-490d-b407-9750a357c413" containerName="extract-utilities" Mar 20 13:40:40 crc kubenswrapper[4856]: E0320 13:40:40.889829 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="registry-server" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.889838 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="registry-server" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.889987 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7575129-b339-490d-b407-9750a357c413" containerName="registry-server" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.890459 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.894521 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.895467 4856 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pmxrh" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.905065 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.918819 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.918939 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["cert-manager/cert-manager-webhook-6888856db4-bs4xl"] Mar 20 13:40:40 crc kubenswrapper[4856]: I0320 13:40:40.919070 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hscn9\" (UniqueName: \"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-kube-api-access-hscn9\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.019932 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hscn9\" (UniqueName: \"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-kube-api-access-hscn9\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.020561 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.044441 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hscn9\" (UniqueName: \"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-kube-api-access-hscn9\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.044945 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/775f75c4-f69a-4f8f-8946-ef4fb8909ea4-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-bs4xl\" (UID: \"775f75c4-f69a-4f8f-8946-ef4fb8909ea4\") " pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.223844 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.680770 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-bs4xl"] Mar 20 13:40:41 crc kubenswrapper[4856]: W0320 13:40:41.686298 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod775f75c4_f69a_4f8f_8946_ef4fb8909ea4.slice/crio-64d1643adeadfb1a664bc7da41fb1ac1e096e8191deb2ecc7991e1c4788f167a WatchSource:0}: Error finding container 64d1643adeadfb1a664bc7da41fb1ac1e096e8191deb2ecc7991e1c4788f167a: Status 404 returned error can't find the container with id 64d1643adeadfb1a664bc7da41fb1ac1e096e8191deb2ecc7991e1c4788f167a Mar 20 13:40:41 crc kubenswrapper[4856]: I0320 13:40:41.951156 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" event={"ID":"775f75c4-f69a-4f8f-8946-ef4fb8909ea4","Type":"ContainerStarted","Data":"64d1643adeadfb1a664bc7da41fb1ac1e096e8191deb2ecc7991e1c4788f167a"} Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.009572 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-r7hgh"] Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.010714 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.014525 4856 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-xzwnv" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.031337 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-r7hgh"] Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.065726 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.065881 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxxkm\" (UniqueName: \"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-kube-api-access-kxxkm\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.166919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.167006 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxxkm\" (UniqueName: 
\"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-kube-api-access-kxxkm\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.190016 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxxkm\" (UniqueName: \"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-kube-api-access-kxxkm\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.190020 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e187cb86-add1-4f10-b59f-cf21b0927b2b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-r7hgh\" (UID: \"e187cb86-add1-4f10-b59f-cf21b0927b2b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.334390 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" Mar 20 13:40:44 crc kubenswrapper[4856]: I0320 13:40:44.795248 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-r7hgh"] Mar 20 13:40:45 crc kubenswrapper[4856]: W0320 13:40:45.970054 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode187cb86_add1_4f10_b59f_cf21b0927b2b.slice/crio-34842294d8e6a98f6944307d9f4d3d9b20a6d9e5f1f048fd23c076d104b69c17 WatchSource:0}: Error finding container 34842294d8e6a98f6944307d9f4d3d9b20a6d9e5f1f048fd23c076d104b69c17: Status 404 returned error can't find the container with id 34842294d8e6a98f6944307d9f4d3d9b20a6d9e5f1f048fd23c076d104b69c17 Mar 20 13:40:45 crc kubenswrapper[4856]: I0320 13:40:45.986107 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" event={"ID":"e187cb86-add1-4f10-b59f-cf21b0927b2b","Type":"ContainerStarted","Data":"34842294d8e6a98f6944307d9f4d3d9b20a6d9e5f1f048fd23c076d104b69c17"} Mar 20 13:40:46 crc kubenswrapper[4856]: I0320 13:40:46.993226 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" event={"ID":"775f75c4-f69a-4f8f-8946-ef4fb8909ea4","Type":"ContainerStarted","Data":"c2093a7b9742f3705f04e6b1944084bbcf546f0b70226b11a3ae14bf767cfcc4"} Mar 20 13:40:46 crc kubenswrapper[4856]: I0320 13:40:46.993416 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:48 crc kubenswrapper[4856]: I0320 13:40:48.001355 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" event={"ID":"e187cb86-add1-4f10-b59f-cf21b0927b2b","Type":"ContainerStarted","Data":"7274a3422274b28ff1626b7491c6e85d909331d7390df4c91be608c4868c768a"} Mar 20 13:40:48 crc 
kubenswrapper[4856]: I0320 13:40:48.021289 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-r7hgh" podStartSLOduration=3.847169773 podStartE2EDuration="5.021255854s" podCreationTimestamp="2026-03-20 13:40:43 +0000 UTC" firstStartedPulling="2026-03-20 13:40:45.972162004 +0000 UTC m=+1060.853188134" lastFinishedPulling="2026-03-20 13:40:47.146248075 +0000 UTC m=+1062.027274215" observedRunningTime="2026-03-20 13:40:48.019897488 +0000 UTC m=+1062.900923638" watchObservedRunningTime="2026-03-20 13:40:48.021255854 +0000 UTC m=+1062.902281994" Mar 20 13:40:48 crc kubenswrapper[4856]: I0320 13:40:48.022609 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" podStartSLOduration=3.677467989 podStartE2EDuration="8.02259992s" podCreationTimestamp="2026-03-20 13:40:40 +0000 UTC" firstStartedPulling="2026-03-20 13:40:41.689612529 +0000 UTC m=+1056.570638659" lastFinishedPulling="2026-03-20 13:40:46.03474446 +0000 UTC m=+1060.915770590" observedRunningTime="2026-03-20 13:40:47.009846979 +0000 UTC m=+1061.890873129" watchObservedRunningTime="2026-03-20 13:40:48.02259992 +0000 UTC m=+1062.903626050" Mar 20 13:40:51 crc kubenswrapper[4856]: I0320 13:40:51.227058 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-bs4xl" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.834228 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-7gbpp"] Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.836225 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.836479 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7gbpp"] Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.838506 4856 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2dvqd" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.873120 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwx7z\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-kube-api-access-vwx7z\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: \"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.873515 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-bound-sa-token\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: \"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.974845 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwx7z\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-kube-api-access-vwx7z\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: \"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.974915 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-bound-sa-token\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: 
\"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.999242 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-bound-sa-token\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: \"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:40:59 crc kubenswrapper[4856]: I0320 13:40:59.999820 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwx7z\" (UniqueName: \"kubernetes.io/projected/4ba16d0c-f526-4e60-b685-e4d8b51766af-kube-api-access-vwx7z\") pod \"cert-manager-545d4d4674-7gbpp\" (UID: \"4ba16d0c-f526-4e60-b685-e4d8b51766af\") " pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:41:00 crc kubenswrapper[4856]: I0320 13:41:00.170946 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-7gbpp" Mar 20 13:41:00 crc kubenswrapper[4856]: I0320 13:41:00.399501 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-7gbpp"] Mar 20 13:41:01 crc kubenswrapper[4856]: I0320 13:41:01.083644 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7gbpp" event={"ID":"4ba16d0c-f526-4e60-b685-e4d8b51766af","Type":"ContainerStarted","Data":"24bcaef434e8824d2d747ffa77c5554bd1de594f31eb14ffee829ce33a0f5b3a"} Mar 20 13:41:01 crc kubenswrapper[4856]: I0320 13:41:01.083951 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-7gbpp" event={"ID":"4ba16d0c-f526-4e60-b685-e4d8b51766af","Type":"ContainerStarted","Data":"d8e515ee42b1cb5e14c7bfcd586879687567bce6ca23929b687ded78f8ec5ffe"} Mar 20 13:41:01 crc kubenswrapper[4856]: I0320 13:41:01.102139 4856 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-545d4d4674-7gbpp" podStartSLOduration=2.102121538 podStartE2EDuration="2.102121538s" podCreationTimestamp="2026-03-20 13:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:41:01.099672404 +0000 UTC m=+1075.980698544" watchObservedRunningTime="2026-03-20 13:41:01.102121538 +0000 UTC m=+1075.983147678" Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.903547 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.905455 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.912746 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zmhrc" Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.912792 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.913194 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 13:41:05 crc kubenswrapper[4856]: I0320 13:41:05.914171 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:06 crc kubenswrapper[4856]: I0320 13:41:06.063909 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ns9z\" (UniqueName: \"kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z\") pod \"openstack-operator-index-gbdpf\" (UID: \"e0e27937-ce53-48a6-a37b-ed79002b2686\") " pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:06 crc 
kubenswrapper[4856]: I0320 13:41:06.166355 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ns9z\" (UniqueName: \"kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z\") pod \"openstack-operator-index-gbdpf\" (UID: \"e0e27937-ce53-48a6-a37b-ed79002b2686\") " pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:06 crc kubenswrapper[4856]: I0320 13:41:06.187981 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ns9z\" (UniqueName: \"kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z\") pod \"openstack-operator-index-gbdpf\" (UID: \"e0e27937-ce53-48a6-a37b-ed79002b2686\") " pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:06 crc kubenswrapper[4856]: I0320 13:41:06.240133 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:06 crc kubenswrapper[4856]: I0320 13:41:06.655480 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:07 crc kubenswrapper[4856]: I0320 13:41:07.132819 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gbdpf" event={"ID":"e0e27937-ce53-48a6-a37b-ed79002b2686","Type":"ContainerStarted","Data":"bfa5c0a4dc10f3015a72d8e09231fc33ada6b56f55869356dd8516f3883c08ec"} Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.243156 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.853172 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-95qtd"] Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.853952 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.868867 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-95qtd"] Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.930381 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncpt5\" (UniqueName: \"kubernetes.io/projected/b2beaa10-d6ab-4f70-a62c-48d441fd3e8b-kube-api-access-ncpt5\") pod \"openstack-operator-index-95qtd\" (UID: \"b2beaa10-d6ab-4f70-a62c-48d441fd3e8b\") " pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.987971 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:09 crc kubenswrapper[4856]: I0320 13:41:09.988341 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.031362 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncpt5\" (UniqueName: \"kubernetes.io/projected/b2beaa10-d6ab-4f70-a62c-48d441fd3e8b-kube-api-access-ncpt5\") pod \"openstack-operator-index-95qtd\" (UID: \"b2beaa10-d6ab-4f70-a62c-48d441fd3e8b\") " pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.050755 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ncpt5\" (UniqueName: \"kubernetes.io/projected/b2beaa10-d6ab-4f70-a62c-48d441fd3e8b-kube-api-access-ncpt5\") pod \"openstack-operator-index-95qtd\" (UID: \"b2beaa10-d6ab-4f70-a62c-48d441fd3e8b\") " pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.159592 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gbdpf" event={"ID":"e0e27937-ce53-48a6-a37b-ed79002b2686","Type":"ContainerStarted","Data":"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0"} Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.159709 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-gbdpf" podUID="e0e27937-ce53-48a6-a37b-ed79002b2686" containerName="registry-server" containerID="cri-o://90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0" gracePeriod=2 Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.181603 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-gbdpf" podStartSLOduration=2.047033307 podStartE2EDuration="5.181573356s" podCreationTimestamp="2026-03-20 13:41:05 +0000 UTC" firstStartedPulling="2026-03-20 13:41:06.667262435 +0000 UTC m=+1081.548288565" lastFinishedPulling="2026-03-20 13:41:09.801802484 +0000 UTC m=+1084.682828614" observedRunningTime="2026-03-20 13:41:10.176072802 +0000 UTC m=+1085.057098942" watchObservedRunningTime="2026-03-20 13:41:10.181573356 +0000 UTC m=+1085.062599546" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.264346 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.560940 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.646106 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ns9z\" (UniqueName: \"kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z\") pod \"e0e27937-ce53-48a6-a37b-ed79002b2686\" (UID: \"e0e27937-ce53-48a6-a37b-ed79002b2686\") " Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.649162 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z" (OuterVolumeSpecName: "kube-api-access-5ns9z") pod "e0e27937-ce53-48a6-a37b-ed79002b2686" (UID: "e0e27937-ce53-48a6-a37b-ed79002b2686"). InnerVolumeSpecName "kube-api-access-5ns9z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.748032 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ns9z\" (UniqueName: \"kubernetes.io/projected/e0e27937-ce53-48a6-a37b-ed79002b2686-kube-api-access-5ns9z\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:10 crc kubenswrapper[4856]: I0320 13:41:10.769566 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-95qtd"] Mar 20 13:41:10 crc kubenswrapper[4856]: W0320 13:41:10.773935 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2beaa10_d6ab_4f70_a62c_48d441fd3e8b.slice/crio-9e3acf5ab75340684c35ad2d1add733987664aa2cca898594b1c0ece782663e8 WatchSource:0}: Error finding container 9e3acf5ab75340684c35ad2d1add733987664aa2cca898594b1c0ece782663e8: Status 404 returned error can't find the container with id 9e3acf5ab75340684c35ad2d1add733987664aa2cca898594b1c0ece782663e8 Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.171371 4856 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-95qtd" event={"ID":"b2beaa10-d6ab-4f70-a62c-48d441fd3e8b","Type":"ContainerStarted","Data":"2e7045255859aa24c3ab96fba242915af262a0da67d7988dca13be6665a4b8dd"} Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.171956 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-95qtd" event={"ID":"b2beaa10-d6ab-4f70-a62c-48d441fd3e8b","Type":"ContainerStarted","Data":"9e3acf5ab75340684c35ad2d1add733987664aa2cca898594b1c0ece782663e8"} Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.173828 4856 generic.go:334] "Generic (PLEG): container finished" podID="e0e27937-ce53-48a6-a37b-ed79002b2686" containerID="90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0" exitCode=0 Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.173878 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gbdpf" event={"ID":"e0e27937-ce53-48a6-a37b-ed79002b2686","Type":"ContainerDied","Data":"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0"} Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.173906 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-gbdpf" event={"ID":"e0e27937-ce53-48a6-a37b-ed79002b2686","Type":"ContainerDied","Data":"bfa5c0a4dc10f3015a72d8e09231fc33ada6b56f55869356dd8516f3883c08ec"} Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.173932 4856 scope.go:117] "RemoveContainer" containerID="90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0" Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.174100 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-gbdpf" Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.200692 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-95qtd" podStartSLOduration=2.160037655 podStartE2EDuration="2.200667163s" podCreationTimestamp="2026-03-20 13:41:09 +0000 UTC" firstStartedPulling="2026-03-20 13:41:10.77786891 +0000 UTC m=+1085.658895040" lastFinishedPulling="2026-03-20 13:41:10.818498418 +0000 UTC m=+1085.699524548" observedRunningTime="2026-03-20 13:41:11.196352699 +0000 UTC m=+1086.077378889" watchObservedRunningTime="2026-03-20 13:41:11.200667163 +0000 UTC m=+1086.081693323" Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.216121 4856 scope.go:117] "RemoveContainer" containerID="90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0" Mar 20 13:41:11 crc kubenswrapper[4856]: E0320 13:41:11.216683 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0\": container with ID starting with 90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0 not found: ID does not exist" containerID="90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0" Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.216723 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0"} err="failed to get container status \"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0\": rpc error: code = NotFound desc = could not find container \"90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0\": container with ID starting with 90c04fc9dc9c2577f9e5188a6fa89c1e452df486a8a74d838897ffcd326355a0 not found: ID does not exist" Mar 20 13:41:11 crc kubenswrapper[4856]: 
I0320 13:41:11.226684 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.233165 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-gbdpf"] Mar 20 13:41:11 crc kubenswrapper[4856]: I0320 13:41:11.832692 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0e27937-ce53-48a6-a37b-ed79002b2686" path="/var/lib/kubelet/pods/e0e27937-ce53-48a6-a37b-ed79002b2686/volumes" Mar 20 13:41:20 crc kubenswrapper[4856]: I0320 13:41:20.266326 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:20 crc kubenswrapper[4856]: I0320 13:41:20.266908 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:20 crc kubenswrapper[4856]: I0320 13:41:20.291472 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:21 crc kubenswrapper[4856]: I0320 13:41:21.288397 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-95qtd" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.503243 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr"] Mar 20 13:41:22 crc kubenswrapper[4856]: E0320 13:41:22.503719 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0e27937-ce53-48a6-a37b-ed79002b2686" containerName="registry-server" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.503745 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0e27937-ce53-48a6-a37b-ed79002b2686" containerName="registry-server" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.503925 
4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0e27937-ce53-48a6-a37b-ed79002b2686" containerName="registry-server" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.505131 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.507487 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-m9l8k" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.519676 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr"] Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.658435 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.658672 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4nl\" (UniqueName: \"kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.658792 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util\") pod 
\"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.760689 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.761626 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.761710 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.761896 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4nl\" (UniqueName: \"kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " 
pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.762323 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.785087 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4nl\" (UniqueName: \"kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl\") pod \"1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:22 crc kubenswrapper[4856]: I0320 13:41:22.827879 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:23 crc kubenswrapper[4856]: I0320 13:41:23.254357 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr"] Mar 20 13:41:23 crc kubenswrapper[4856]: I0320 13:41:23.280839 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" event={"ID":"cde5a69b-f5f6-401d-88c4-81ad127e860f","Type":"ContainerStarted","Data":"223144388a77f75891d72e590654dd1d892fa2793fc2d9bf19c7c20430dacfc5"} Mar 20 13:41:24 crc kubenswrapper[4856]: I0320 13:41:24.288052 4856 generic.go:334] "Generic (PLEG): container finished" podID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerID="71cd135ac3ccda80d7aebe5353b464b6b2440347880260be1bb130f3ff31362e" exitCode=0 Mar 20 13:41:24 crc kubenswrapper[4856]: I0320 13:41:24.288096 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" event={"ID":"cde5a69b-f5f6-401d-88c4-81ad127e860f","Type":"ContainerDied","Data":"71cd135ac3ccda80d7aebe5353b464b6b2440347880260be1bb130f3ff31362e"} Mar 20 13:41:25 crc kubenswrapper[4856]: I0320 13:41:25.298199 4856 generic.go:334] "Generic (PLEG): container finished" podID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerID="9a2430ac561be164f482cd90251052da4ff844d6e5df8aca26efa92b87693d88" exitCode=0 Mar 20 13:41:25 crc kubenswrapper[4856]: I0320 13:41:25.298345 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" event={"ID":"cde5a69b-f5f6-401d-88c4-81ad127e860f","Type":"ContainerDied","Data":"9a2430ac561be164f482cd90251052da4ff844d6e5df8aca26efa92b87693d88"} Mar 20 13:41:26 crc kubenswrapper[4856]: I0320 13:41:26.309196 4856 generic.go:334] 
"Generic (PLEG): container finished" podID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerID="f379d1d4271f8f4c5c0246cbd76fb24ce654ffeabc9e8f7b9a62b01f336c3a83" exitCode=0 Mar 20 13:41:26 crc kubenswrapper[4856]: I0320 13:41:26.310699 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" event={"ID":"cde5a69b-f5f6-401d-88c4-81ad127e860f","Type":"ContainerDied","Data":"f379d1d4271f8f4c5c0246cbd76fb24ce654ffeabc9e8f7b9a62b01f336c3a83"} Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.680102 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.838814 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util\") pod \"cde5a69b-f5f6-401d-88c4-81ad127e860f\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.839126 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle\") pod \"cde5a69b-f5f6-401d-88c4-81ad127e860f\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.839207 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t4nl\" (UniqueName: \"kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl\") pod \"cde5a69b-f5f6-401d-88c4-81ad127e860f\" (UID: \"cde5a69b-f5f6-401d-88c4-81ad127e860f\") " Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.839758 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle" (OuterVolumeSpecName: "bundle") pod "cde5a69b-f5f6-401d-88c4-81ad127e860f" (UID: "cde5a69b-f5f6-401d-88c4-81ad127e860f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.848493 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl" (OuterVolumeSpecName: "kube-api-access-8t4nl") pod "cde5a69b-f5f6-401d-88c4-81ad127e860f" (UID: "cde5a69b-f5f6-401d-88c4-81ad127e860f"). InnerVolumeSpecName "kube-api-access-8t4nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.875983 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util" (OuterVolumeSpecName: "util") pod "cde5a69b-f5f6-401d-88c4-81ad127e860f" (UID: "cde5a69b-f5f6-401d-88c4-81ad127e860f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.945342 4856 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.945411 4856 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cde5a69b-f5f6-401d-88c4-81ad127e860f-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:27 crc kubenswrapper[4856]: I0320 13:41:27.945432 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t4nl\" (UniqueName: \"kubernetes.io/projected/cde5a69b-f5f6-401d-88c4-81ad127e860f-kube-api-access-8t4nl\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:28 crc kubenswrapper[4856]: I0320 13:41:28.329789 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" event={"ID":"cde5a69b-f5f6-401d-88c4-81ad127e860f","Type":"ContainerDied","Data":"223144388a77f75891d72e590654dd1d892fa2793fc2d9bf19c7c20430dacfc5"} Mar 20 13:41:28 crc kubenswrapper[4856]: I0320 13:41:28.329844 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="223144388a77f75891d72e590654dd1d892fa2793fc2d9bf19c7c20430dacfc5" Mar 20 13:41:28 crc kubenswrapper[4856]: I0320 13:41:28.329939 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.587522 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf"] Mar 20 13:41:34 crc kubenswrapper[4856]: E0320 13:41:34.589974 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="pull" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.590135 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="pull" Mar 20 13:41:34 crc kubenswrapper[4856]: E0320 13:41:34.590320 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="util" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.590438 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="util" Mar 20 13:41:34 crc kubenswrapper[4856]: E0320 13:41:34.590597 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="extract" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.590726 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="extract" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.591041 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde5a69b-f5f6-401d-88c4-81ad127e860f" containerName="extract" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.591848 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.597704 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-22wvm" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.639209 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf"] Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.735209 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws49x\" (UniqueName: \"kubernetes.io/projected/b64623b8-5be4-4269-891b-bb0154ab18b3-kube-api-access-ws49x\") pod \"openstack-operator-controller-init-59b5998766-f2xcf\" (UID: \"b64623b8-5be4-4269-891b-bb0154ab18b3\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.836545 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws49x\" (UniqueName: \"kubernetes.io/projected/b64623b8-5be4-4269-891b-bb0154ab18b3-kube-api-access-ws49x\") pod \"openstack-operator-controller-init-59b5998766-f2xcf\" (UID: \"b64623b8-5be4-4269-891b-bb0154ab18b3\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.853861 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws49x\" (UniqueName: \"kubernetes.io/projected/b64623b8-5be4-4269-891b-bb0154ab18b3-kube-api-access-ws49x\") pod \"openstack-operator-controller-init-59b5998766-f2xcf\" (UID: \"b64623b8-5be4-4269-891b-bb0154ab18b3\") " pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:34 crc kubenswrapper[4856]: I0320 13:41:34.921434 4856 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:35 crc kubenswrapper[4856]: I0320 13:41:35.203587 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf"] Mar 20 13:41:35 crc kubenswrapper[4856]: I0320 13:41:35.397333 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" event={"ID":"b64623b8-5be4-4269-891b-bb0154ab18b3","Type":"ContainerStarted","Data":"258daa8bcfa1c67c8ba042d5d2566f999faa84297d144273092793e84212db8b"} Mar 20 13:41:39 crc kubenswrapper[4856]: I0320 13:41:39.447549 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" event={"ID":"b64623b8-5be4-4269-891b-bb0154ab18b3","Type":"ContainerStarted","Data":"f06efa33bd8d34bf93bcd9b755a229a0fec6917efbc7dac6a9d06dd7bd23356b"} Mar 20 13:41:39 crc kubenswrapper[4856]: I0320 13:41:39.988019 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:39 crc kubenswrapper[4856]: I0320 13:41:39.988084 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:40 crc kubenswrapper[4856]: I0320 13:41:40.454655 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:41:40 crc 
kubenswrapper[4856]: I0320 13:41:40.484454 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" podStartSLOduration=2.453620343 podStartE2EDuration="6.484437161s" podCreationTimestamp="2026-03-20 13:41:34 +0000 UTC" firstStartedPulling="2026-03-20 13:41:35.212737247 +0000 UTC m=+1110.093763387" lastFinishedPulling="2026-03-20 13:41:39.243554045 +0000 UTC m=+1114.124580205" observedRunningTime="2026-03-20 13:41:40.484071052 +0000 UTC m=+1115.365097222" watchObservedRunningTime="2026-03-20 13:41:40.484437161 +0000 UTC m=+1115.365463291" Mar 20 13:41:44 crc kubenswrapper[4856]: I0320 13:41:44.925066 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-59b5998766-f2xcf" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.125193 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-8j2v4"] Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.127679 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.132916 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.133102 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.133122 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.140702 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-8j2v4"] Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.285582 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z55lb\" (UniqueName: \"kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb\") pod \"auto-csr-approver-29566902-8j2v4\" (UID: \"4b72b040-1c32-472d-b5e1-8ee3a7ace646\") " pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.386897 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z55lb\" (UniqueName: \"kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb\") pod \"auto-csr-approver-29566902-8j2v4\" (UID: \"4b72b040-1c32-472d-b5e1-8ee3a7ace646\") " pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.415389 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z55lb\" (UniqueName: \"kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb\") pod \"auto-csr-approver-29566902-8j2v4\" (UID: \"4b72b040-1c32-472d-b5e1-8ee3a7ace646\") " 
pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.457323 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:00 crc kubenswrapper[4856]: I0320 13:42:00.899233 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-8j2v4"] Mar 20 13:42:00 crc kubenswrapper[4856]: W0320 13:42:00.905746 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b72b040_1c32_472d_b5e1_8ee3a7ace646.slice/crio-164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28 WatchSource:0}: Error finding container 164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28: Status 404 returned error can't find the container with id 164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28 Mar 20 13:42:01 crc kubenswrapper[4856]: I0320 13:42:01.609342 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" event={"ID":"4b72b040-1c32-472d-b5e1-8ee3a7ace646","Type":"ContainerStarted","Data":"164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28"} Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.618135 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" event={"ID":"4b72b040-1c32-472d-b5e1-8ee3a7ace646","Type":"ContainerStarted","Data":"7ede2af770aa7660a0b1ba7d50f4622ef7d8f2f6c4c6d3481c5395c80c2a35bb"} Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.637106 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" podStartSLOduration=1.628622673 podStartE2EDuration="2.637089251s" podCreationTimestamp="2026-03-20 13:42:00 +0000 UTC" firstStartedPulling="2026-03-20 13:42:00.908152817 +0000 UTC 
m=+1135.789178947" lastFinishedPulling="2026-03-20 13:42:01.916619385 +0000 UTC m=+1136.797645525" observedRunningTime="2026-03-20 13:42:02.633039355 +0000 UTC m=+1137.514065495" watchObservedRunningTime="2026-03-20 13:42:02.637089251 +0000 UTC m=+1137.518115381" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.663285 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.663993 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.668137 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-5jzg9" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.675075 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.708029 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.709187 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.714011 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.714992 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.716837 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vkcmq" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.719462 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-djq4v" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.723412 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.724139 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.729838 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tc54q" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.732247 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngntm\" (UniqueName: \"kubernetes.io/projected/c22916c3-cf42-4583-8529-2f42a5780500-kube-api-access-ngntm\") pod \"barbican-operator-controller-manager-59bc569d95-h2vzr\" (UID: \"c22916c3-cf42-4583-8529-2f42a5780500\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.732400 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.733372 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.741368 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.741593 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vcxgm" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.745144 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.753964 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.761710 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.787339 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.788368 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.794314 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nl78x" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.818109 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.819619 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.830428 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.831368 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.833258 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngntm\" (UniqueName: \"kubernetes.io/projected/c22916c3-cf42-4583-8529-2f42a5780500-kube-api-access-ngntm\") pod \"barbican-operator-controller-manager-59bc569d95-h2vzr\" (UID: \"c22916c3-cf42-4583-8529-2f42a5780500\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.833410 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58hg\" (UniqueName: \"kubernetes.io/projected/122f071b-3f1d-4364-8142-466caeb29677-kube-api-access-p58hg\") pod \"cinder-operator-controller-manager-8d58dc466-v698k\" (UID: \"122f071b-3f1d-4364-8142-466caeb29677\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.833468 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88qs\" (UniqueName: \"kubernetes.io/projected/43236c8a-2018-4001-a8dc-67a9d4488f0a-kube-api-access-w88qs\") pod \"glance-operator-controller-manager-79df6bcc97-7xj74\" (UID: \"43236c8a-2018-4001-a8dc-67a9d4488f0a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.833511 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmdh\" (UniqueName: \"kubernetes.io/projected/602b2383-2c80-49b5-afa6-400c6022f0d6-kube-api-access-qkmdh\") pod \"designate-operator-controller-manager-588d4d986b-4qkfk\" (UID: \"602b2383-2c80-49b5-afa6-400c6022f0d6\") " 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.833624 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdrwv\" (UniqueName: \"kubernetes.io/projected/60481295-8929-4cff-88c0-fc9645c555e6-kube-api-access-gdrwv\") pod \"heat-operator-controller-manager-67dd5f86f5-7bv99\" (UID: \"60481295-8929-4cff-88c0-fc9645c555e6\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.835564 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rj6xl" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.835893 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2kpqp" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.836106 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.846641 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.855913 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.890998 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.901899 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 
13:42:02.903216 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.904309 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngntm\" (UniqueName: \"kubernetes.io/projected/c22916c3-cf42-4583-8529-2f42a5780500-kube-api-access-ngntm\") pod \"barbican-operator-controller-manager-59bc569d95-h2vzr\" (UID: \"c22916c3-cf42-4583-8529-2f42a5780500\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.912074 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tp44w" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.912386 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.917781 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.919139 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.934133 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-hwjlz" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.947650 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"] Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.984467 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdrwv\" (UniqueName: \"kubernetes.io/projected/60481295-8929-4cff-88c0-fc9645c555e6-kube-api-access-gdrwv\") pod \"heat-operator-controller-manager-67dd5f86f5-7bv99\" (UID: \"60481295-8929-4cff-88c0-fc9645c555e6\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.984581 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j46l\" (UniqueName: \"kubernetes.io/projected/803de023-bc1c-42f0-899b-b7053081db3b-kube-api-access-6j46l\") pod \"horizon-operator-controller-manager-8464cc45fb-rf4db\" (UID: \"803de023-bc1c-42f0-899b-b7053081db3b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999511 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv5p6\" (UniqueName: \"kubernetes.io/projected/fee7d83a-7c59-4a95-85b3-8f677f068731-kube-api-access-dv5p6\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999657 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p58hg\" (UniqueName: \"kubernetes.io/projected/122f071b-3f1d-4364-8142-466caeb29677-kube-api-access-p58hg\") pod \"cinder-operator-controller-manager-8d58dc466-v698k\" (UID: \"122f071b-3f1d-4364-8142-466caeb29677\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999695 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999731 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88qs\" (UniqueName: \"kubernetes.io/projected/43236c8a-2018-4001-a8dc-67a9d4488f0a-kube-api-access-w88qs\") pod \"glance-operator-controller-manager-79df6bcc97-7xj74\" (UID: \"43236c8a-2018-4001-a8dc-67a9d4488f0a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999771 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95z6w\" (UniqueName: \"kubernetes.io/projected/4ca1fc8c-012a-4067-8ca1-ae2424a66b65-kube-api-access-95z6w\") pod \"ironic-operator-controller-manager-6f787dddc9-7zxs6\" (UID: \"4ca1fc8c-012a-4067-8ca1-ae2424a66b65\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" Mar 20 13:42:02 crc kubenswrapper[4856]: I0320 13:42:02.999832 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmdh\" (UniqueName: 
\"kubernetes.io/projected/602b2383-2c80-49b5-afa6-400c6022f0d6-kube-api-access-qkmdh\") pod \"designate-operator-controller-manager-588d4d986b-4qkfk\" (UID: \"602b2383-2c80-49b5-afa6-400c6022f0d6\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.005114 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.008386 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.015773 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dstsq" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.038305 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.042176 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88qs\" (UniqueName: \"kubernetes.io/projected/43236c8a-2018-4001-a8dc-67a9d4488f0a-kube-api-access-w88qs\") pod \"glance-operator-controller-manager-79df6bcc97-7xj74\" (UID: \"43236c8a-2018-4001-a8dc-67a9d4488f0a\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.051477 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.056887 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdrwv\" (UniqueName: \"kubernetes.io/projected/60481295-8929-4cff-88c0-fc9645c555e6-kube-api-access-gdrwv\") pod \"heat-operator-controller-manager-67dd5f86f5-7bv99\" (UID: \"60481295-8929-4cff-88c0-fc9645c555e6\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.057133 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-h7vjf" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.057247 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.058467 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmdh\" (UniqueName: \"kubernetes.io/projected/602b2383-2c80-49b5-afa6-400c6022f0d6-kube-api-access-qkmdh\") pod \"designate-operator-controller-manager-588d4d986b-4qkfk\" (UID: \"602b2383-2c80-49b5-afa6-400c6022f0d6\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.062952 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.072701 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.073418 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.086222 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p58hg\" (UniqueName: \"kubernetes.io/projected/122f071b-3f1d-4364-8142-466caeb29677-kube-api-access-p58hg\") pod \"cinder-operator-controller-manager-8d58dc466-v698k\" (UID: \"122f071b-3f1d-4364-8142-466caeb29677\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.086324 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.091765 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.095156 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.101475 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mr5\" (UniqueName: \"kubernetes.io/projected/ef1eeee2-e51e-4771-934c-a4b0c9e4d949-kube-api-access-56mr5\") pod \"manila-operator-controller-manager-55f864c847-pv5l7\" (UID: \"ef1eeee2-e51e-4771-934c-a4b0c9e4d949\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.101544 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j46l\" (UniqueName: \"kubernetes.io/projected/803de023-bc1c-42f0-899b-b7053081db3b-kube-api-access-6j46l\") pod \"horizon-operator-controller-manager-8464cc45fb-rf4db\" (UID: \"803de023-bc1c-42f0-899b-b7053081db3b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.101582 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dv5p6\" (UniqueName: \"kubernetes.io/projected/fee7d83a-7c59-4a95-85b3-8f677f068731-kube-api-access-dv5p6\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.101611 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cpl7\" (UniqueName: \"kubernetes.io/projected/16329126-6028-435b-b961-b483af84efc2-kube-api-access-2cpl7\") pod \"keystone-operator-controller-manager-768b96df4c-hwz76\" (UID: \"16329126-6028-435b-b961-b483af84efc2\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" Mar 20 13:42:03 crc 
kubenswrapper[4856]: I0320 13:42:03.101661 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.101692 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95z6w\" (UniqueName: \"kubernetes.io/projected/4ca1fc8c-012a-4067-8ca1-ae2424a66b65-kube-api-access-95z6w\") pod \"ironic-operator-controller-manager-6f787dddc9-7zxs6\" (UID: \"4ca1fc8c-012a-4067-8ca1-ae2424a66b65\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.102517 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.102576 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:03.602558127 +0000 UTC m=+1138.483584267 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.108918 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-gqcnt" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.119351 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.120426 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.125871 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-tsqv4" Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.126033 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.140426 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"] Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.142696 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j46l\" (UniqueName: \"kubernetes.io/projected/803de023-bc1c-42f0-899b-b7053081db3b-kube-api-access-6j46l\") pod \"horizon-operator-controller-manager-8464cc45fb-rf4db\" (UID: \"803de023-bc1c-42f0-899b-b7053081db3b\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" Mar 20 13:42:03 crc kubenswrapper[4856]: 
I0320 13:42:03.143701 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.144807 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.149030 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-95rk9"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.158539 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.159589 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.161131 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f99ct"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.161646 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.176543 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv5p6\" (UniqueName: \"kubernetes.io/projected/fee7d83a-7c59-4a95-85b3-8f677f068731-kube-api-access-dv5p6\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.180060 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.181122 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.185359 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-b4sdg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.193911 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95z6w\" (UniqueName: \"kubernetes.io/projected/4ca1fc8c-012a-4067-8ca1-ae2424a66b65-kube-api-access-95z6w\") pod \"ironic-operator-controller-manager-6f787dddc9-7zxs6\" (UID: \"4ca1fc8c-012a-4067-8ca1-ae2424a66b65\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.196348 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.203193 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6x2d\" (UniqueName: \"kubernetes.io/projected/cfd501fb-ec8a-4b56-840f-975ed1184cd3-kube-api-access-b6x2d\") pod \"mariadb-operator-controller-manager-67ccfc9778-ckd6w\" (UID: \"cfd501fb-ec8a-4b56-840f-975ed1184cd3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.203260 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswbt\" (UniqueName: \"kubernetes.io/projected/ef7a6885-15ab-47ac-911f-5ef35b971f7f-kube-api-access-tswbt\") pod \"neutron-operator-controller-manager-767865f676-95dk8\" (UID: \"ef7a6885-15ab-47ac-911f-5ef35b971f7f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.203370 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zz2\" (UniqueName: \"kubernetes.io/projected/eb68c239-2237-493b-8943-597ad3822379-kube-api-access-q7zz2\") pod \"nova-operator-controller-manager-5d488d59fb-r6gkh\" (UID: \"eb68c239-2237-493b-8943-597ad3822379\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.203457 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56mr5\" (UniqueName: \"kubernetes.io/projected/ef1eeee2-e51e-4771-934c-a4b0c9e4d949-kube-api-access-56mr5\") pod \"manila-operator-controller-manager-55f864c847-pv5l7\" (UID: \"ef1eeee2-e51e-4771-934c-a4b0c9e4d949\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.203514 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cpl7\" (UniqueName: \"kubernetes.io/projected/16329126-6028-435b-b961-b483af84efc2-kube-api-access-2cpl7\") pod \"keystone-operator-controller-manager-768b96df4c-hwz76\" (UID: \"16329126-6028-435b-b961-b483af84efc2\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.223150 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56mr5\" (UniqueName: \"kubernetes.io/projected/ef1eeee2-e51e-4771-934c-a4b0c9e4d949-kube-api-access-56mr5\") pod \"manila-operator-controller-manager-55f864c847-pv5l7\" (UID: \"ef1eeee2-e51e-4771-934c-a4b0c9e4d949\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.225291 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.232917 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cpl7\" (UniqueName: \"kubernetes.io/projected/16329126-6028-435b-b961-b483af84efc2-kube-api-access-2cpl7\") pod \"keystone-operator-controller-manager-768b96df4c-hwz76\" (UID: \"16329126-6028-435b-b961-b483af84efc2\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.248783 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.271041 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.305967 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.315455 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.315612 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.316338 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.316749 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.317809 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkj52\" (UniqueName: \"kubernetes.io/projected/32083d25-90e1-4571-959b-629f6d8393a5-kube-api-access-wkj52\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.317865 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gldd6\" (UniqueName: \"kubernetes.io/projected/f62801cd-9d41-4312-a337-2e39d0bb1997-kube-api-access-gldd6\") pod \"placement-operator-controller-manager-5784578c99-s2jkg\" (UID: \"f62801cd-9d41-4312-a337-2e39d0bb1997\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.317921 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6x2d\" (UniqueName: \"kubernetes.io/projected/cfd501fb-ec8a-4b56-840f-975ed1184cd3-kube-api-access-b6x2d\") pod \"mariadb-operator-controller-manager-67ccfc9778-ckd6w\" (UID: \"cfd501fb-ec8a-4b56-840f-975ed1184cd3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.317944 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswbt\" (UniqueName: \"kubernetes.io/projected/ef7a6885-15ab-47ac-911f-5ef35b971f7f-kube-api-access-tswbt\") pod \"neutron-operator-controller-manager-767865f676-95dk8\" (UID: \"ef7a6885-15ab-47ac-911f-5ef35b971f7f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.317976 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7gdj\" (UniqueName: \"kubernetes.io/projected/52b4ae24-4743-4bea-aac3-a6a2fd4b1990-kube-api-access-g7gdj\") pod \"ovn-operator-controller-manager-884679f54-fr4gw\" (UID: \"52b4ae24-4743-4bea-aac3-a6a2fd4b1990\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.318001 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.318025 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zz2\" (UniqueName: \"kubernetes.io/projected/eb68c239-2237-493b-8943-597ad3822379-kube-api-access-q7zz2\") pod \"nova-operator-controller-manager-5d488d59fb-r6gkh\" (UID: \"eb68c239-2237-493b-8943-597ad3822379\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.318545 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.318984 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nwxcw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.319174 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bjqrd"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.321245 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6sdr\" (UniqueName: \"kubernetes.io/projected/74388a48-2f88-4093-a143-628de32ad98c-kube-api-access-p6sdr\") pod \"octavia-operator-controller-manager-5b9f45d989-58gr9\" (UID: \"74388a48-2f88-4093-a143-628de32ad98c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.326782 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.348285 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.349378 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.358360 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswbt\" (UniqueName: \"kubernetes.io/projected/ef7a6885-15ab-47ac-911f-5ef35b971f7f-kube-api-access-tswbt\") pod \"neutron-operator-controller-manager-767865f676-95dk8\" (UID: \"ef7a6885-15ab-47ac-911f-5ef35b971f7f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.362345 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6x2d\" (UniqueName: \"kubernetes.io/projected/cfd501fb-ec8a-4b56-840f-975ed1184cd3-kube-api-access-b6x2d\") pod \"mariadb-operator-controller-manager-67ccfc9778-ckd6w\" (UID: \"cfd501fb-ec8a-4b56-840f-975ed1184cd3\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.362444 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zz2\" (UniqueName: \"kubernetes.io/projected/eb68c239-2237-493b-8943-597ad3822379-kube-api-access-q7zz2\") pod \"nova-operator-controller-manager-5d488d59fb-r6gkh\" (UID: \"eb68c239-2237-493b-8943-597ad3822379\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.389535 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.390716 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.394031 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.404477 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-2wptx"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.410656 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.427870 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gldd6\" (UniqueName: \"kubernetes.io/projected/f62801cd-9d41-4312-a337-2e39d0bb1997-kube-api-access-gldd6\") pod \"placement-operator-controller-manager-5784578c99-s2jkg\" (UID: \"f62801cd-9d41-4312-a337-2e39d0bb1997\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428050 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7gdj\" (UniqueName: \"kubernetes.io/projected/52b4ae24-4743-4bea-aac3-a6a2fd4b1990-kube-api-access-g7gdj\") pod \"ovn-operator-controller-manager-884679f54-fr4gw\" (UID: \"52b4ae24-4743-4bea-aac3-a6a2fd4b1990\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428091 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428120 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcjb\" (UniqueName: \"kubernetes.io/projected/b3055bde-69b7-478d-8ddf-bb187b58e23e-kube-api-access-fpcjb\") pod \"swift-operator-controller-manager-c674c5965-xv6h2\" (UID: \"b3055bde-69b7-478d-8ddf-bb187b58e23e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428151 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x98np\" (UniqueName: \"kubernetes.io/projected/f6c2630c-bcb8-45a6-96ee-1cbe64b472ea-kube-api-access-x98np\") pod \"test-operator-controller-manager-5c5cb9c4d7-nwp96\" (UID: \"f6c2630c-bcb8-45a6-96ee-1cbe64b472ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428242 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6sdr\" (UniqueName: \"kubernetes.io/projected/74388a48-2f88-4093-a143-628de32ad98c-kube-api-access-p6sdr\") pod \"octavia-operator-controller-manager-5b9f45d989-58gr9\" (UID: \"74388a48-2f88-4093-a143-628de32ad98c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428403 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkj52\" (UniqueName: \"kubernetes.io/projected/32083d25-90e1-4571-959b-629f6d8393a5-kube-api-access-wkj52\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.428445 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr5xg\" (UniqueName: \"kubernetes.io/projected/625deb2f-8031-4fbd-93da-c6cfb29b5d9f-kube-api-access-hr5xg\") pod \"telemetry-operator-controller-manager-d6b694c5-2b2z6\" (UID: \"625deb2f-8031-4fbd-93da-c6cfb29b5d9f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.428965 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.429011 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:03.928994547 +0000 UTC m=+1138.810020677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.444465 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.459967 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.466803 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7gdj\" (UniqueName: \"kubernetes.io/projected/52b4ae24-4743-4bea-aac3-a6a2fd4b1990-kube-api-access-g7gdj\") pod \"ovn-operator-controller-manager-884679f54-fr4gw\" (UID: \"52b4ae24-4743-4bea-aac3-a6a2fd4b1990\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.467210 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gldd6\" (UniqueName: \"kubernetes.io/projected/f62801cd-9d41-4312-a337-2e39d0bb1997-kube-api-access-gldd6\") pod \"placement-operator-controller-manager-5784578c99-s2jkg\" (UID: \"f62801cd-9d41-4312-a337-2e39d0bb1997\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.470233 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkj52\" (UniqueName: \"kubernetes.io/projected/32083d25-90e1-4571-959b-629f6d8393a5-kube-api-access-wkj52\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.472991 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6sdr\" (UniqueName: \"kubernetes.io/projected/74388a48-2f88-4093-a143-628de32ad98c-kube-api-access-p6sdr\") pod \"octavia-operator-controller-manager-5b9f45d989-58gr9\" (UID: \"74388a48-2f88-4093-a143-628de32ad98c\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.524817 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.531420 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.536137 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.536597 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr5xg\" (UniqueName: \"kubernetes.io/projected/625deb2f-8031-4fbd-93da-c6cfb29b5d9f-kube-api-access-hr5xg\") pod \"telemetry-operator-controller-manager-d6b694c5-2b2z6\" (UID: \"625deb2f-8031-4fbd-93da-c6cfb29b5d9f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.536798 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcjb\" (UniqueName: \"kubernetes.io/projected/b3055bde-69b7-478d-8ddf-bb187b58e23e-kube-api-access-fpcjb\") pod \"swift-operator-controller-manager-c674c5965-xv6h2\" (UID: \"b3055bde-69b7-478d-8ddf-bb187b58e23e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.536824 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x98np\" (UniqueName: \"kubernetes.io/projected/f6c2630c-bcb8-45a6-96ee-1cbe64b472ea-kube-api-access-x98np\") pod \"test-operator-controller-manager-5c5cb9c4d7-nwp96\" (UID: \"f6c2630c-bcb8-45a6-96ee-1cbe64b472ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.541525 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.548228 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-l87dp"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.573534 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.580184 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcjb\" (UniqueName: \"kubernetes.io/projected/b3055bde-69b7-478d-8ddf-bb187b58e23e-kube-api-access-fpcjb\") pod \"swift-operator-controller-manager-c674c5965-xv6h2\" (UID: \"b3055bde-69b7-478d-8ddf-bb187b58e23e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.581260 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x98np\" (UniqueName: \"kubernetes.io/projected/f6c2630c-bcb8-45a6-96ee-1cbe64b472ea-kube-api-access-x98np\") pod \"test-operator-controller-manager-5c5cb9c4d7-nwp96\" (UID: \"f6c2630c-bcb8-45a6-96ee-1cbe64b472ea\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.584255 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr5xg\" (UniqueName: \"kubernetes.io/projected/625deb2f-8031-4fbd-93da-c6cfb29b5d9f-kube-api-access-hr5xg\") pod \"telemetry-operator-controller-manager-d6b694c5-2b2z6\" (UID: \"625deb2f-8031-4fbd-93da-c6cfb29b5d9f\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.585071 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.602183 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.632123 4856 generic.go:334] "Generic (PLEG): container finished" podID="4b72b040-1c32-472d-b5e1-8ee3a7ace646" containerID="7ede2af770aa7660a0b1ba7d50f4622ef7d8f2f6c4c6d3481c5395c80c2a35bb" exitCode=0
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.632175 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" event={"ID":"4b72b040-1c32-472d-b5e1-8ee3a7ace646","Type":"ContainerDied","Data":"7ede2af770aa7660a0b1ba7d50f4622ef7d8f2f6c4c6d3481c5395c80c2a35bb"}
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.639075 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.639397 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.639478 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.639456278 +0000 UTC m=+1139.520482418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.665849 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.684382 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.685713 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.689664 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.690666 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.691768 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-c6ff6"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.691869 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.691950 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.709242 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.721448 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.740208 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdpkv\" (UniqueName: \"kubernetes.io/projected/e75213b1-7eab-451f-bca1-4f38db805ba7-kube-api-access-mdpkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w6tnk\" (UID: \"e75213b1-7eab-451f-bca1-4f38db805ba7\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.744465 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.841322 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdpkv\" (UniqueName: \"kubernetes.io/projected/e75213b1-7eab-451f-bca1-4f38db805ba7-kube-api-access-mdpkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w6tnk\" (UID: \"e75213b1-7eab-451f-bca1-4f38db805ba7\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.841385 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.841406 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/78560b1b-78fa-4282-a6c3-a06306ab470c-kube-api-access-pnzql\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.841424 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.866159 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdpkv\" (UniqueName: \"kubernetes.io/projected/e75213b1-7eab-451f-bca1-4f38db805ba7-kube-api-access-mdpkv\") pod \"watcher-operator-controller-manager-6c4d75f7f9-w6tnk\" (UID: \"e75213b1-7eab-451f-bca1-4f38db805ba7\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.902117 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.939771 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.943206 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.943256 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.943302 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/78560b1b-78fa-4282-a6c3-a06306ab470c-kube-api-access-pnzql\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.943366 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943555 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943624 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.943605622 +0000 UTC m=+1139.824631752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943707 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943801 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.443775317 +0000 UTC m=+1139.324801447 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943865 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: E0320 13:42:03.943898 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.44388687 +0000 UTC m=+1139.324913110 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.960042 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99"]
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.970174 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnzql\" (UniqueName: \"kubernetes.io/projected/78560b1b-78fa-4282-a6c3-a06306ab470c-kube-api-access-pnzql\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:03 crc kubenswrapper[4856]: I0320 13:42:03.983118 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"]
Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.002517 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16329126_6028_435b_b961_b483af84efc2.slice/crio-85549a319cd390b6dd50acfcc04c10ebed52b021440916b40459801fee6658bf WatchSource:0}: Error finding container 85549a319cd390b6dd50acfcc04c10ebed52b021440916b40459801fee6658bf: Status 404 returned error can't find the container with id 85549a319cd390b6dd50acfcc04c10ebed52b021440916b40459801fee6658bf
Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.175637 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"]
Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.430641 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.468429 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.468637 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.468717 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.468689855 +0000 UTC m=+1140.349715985 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.468832 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.468911 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.46888829 +0000 UTC m=+1140.349914470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.468516 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.490414 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803de023_bc1c_42f0_899b_b7053081db3b.slice/crio-ae62be020f7cd2ab9f6a011233f7e869a023bada5f0e5828394deb31b0923f99 WatchSource:0}: Error finding container ae62be020f7cd2ab9f6a011233f7e869a023bada5f0e5828394deb31b0923f99: Status 404 returned error can't find the container with id ae62be020f7cd2ab9f6a011233f7e869a023bada5f0e5828394deb31b0923f99 Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.497141 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk"] Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.502677 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod602b2383_2c80_49b5_afa6_400c6022f0d6.slice/crio-252c13644100d4b31cedd54b59afe88a45974eece5e63567fc5b689dda03a372 WatchSource:0}: Error finding container 252c13644100d4b31cedd54b59afe88a45974eece5e63567fc5b689dda03a372: Status 404 returned error can't find the container with id 
252c13644100d4b31cedd54b59afe88a45974eece5e63567fc5b689dda03a372 Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.650826 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.657016 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" event={"ID":"ef1eeee2-e51e-4771-934c-a4b0c9e4d949","Type":"ContainerStarted","Data":"222e972e6c426a70d723ba2de218daed71c3da9d600c74a3f0e69a8339141d57"} Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.658694 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" event={"ID":"c22916c3-cf42-4583-8529-2f42a5780500","Type":"ContainerStarted","Data":"421b41443ef9bc60b85b51b96a76db3638728dbd670d29d1d605e87014354338"} Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.660962 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" event={"ID":"16329126-6028-435b-b961-b483af84efc2","Type":"ContainerStarted","Data":"85549a319cd390b6dd50acfcc04c10ebed52b021440916b40459801fee6658bf"} Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.661151 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.662898 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" event={"ID":"43236c8a-2018-4001-a8dc-67a9d4488f0a","Type":"ContainerStarted","Data":"3008d924b6c3d666c42e80a2f1a33d93c42e12971e568506a422d1cc8562031d"} Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.663832 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod122f071b_3f1d_4364_8142_466caeb29677.slice/crio-c3bdfbb08b0f3175d2609e90cdd50d7b427efa416d352161f4cfab780e0d4c58 WatchSource:0}: Error finding container c3bdfbb08b0f3175d2609e90cdd50d7b427efa416d352161f4cfab780e0d4c58: Status 404 returned error can't find the container with id c3bdfbb08b0f3175d2609e90cdd50d7b427efa416d352161f4cfab780e0d4c58 Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.663906 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" event={"ID":"60481295-8929-4cff-88c0-fc9645c555e6","Type":"ContainerStarted","Data":"02c8bff2d579c36232c83cae04c49a78750222021e3e5e93fd138c3829aa11a9"} Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.664822 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" event={"ID":"602b2383-2c80-49b5-afa6-400c6022f0d6","Type":"ContainerStarted","Data":"252c13644100d4b31cedd54b59afe88a45974eece5e63567fc5b689dda03a372"} Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.665817 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" event={"ID":"803de023-bc1c-42f0-899b-b7053081db3b","Type":"ContainerStarted","Data":"ae62be020f7cd2ab9f6a011233f7e869a023bada5f0e5828394deb31b0923f99"} Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.667345 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb68c239_2237_493b_8943_597ad3822379.slice/crio-f8c8f07ae46af352cd11c784b6894de9c3b2cc6870d5572d839456c0eb678230 WatchSource:0}: Error finding container f8c8f07ae46af352cd11c784b6894de9c3b2cc6870d5572d839456c0eb678230: Status 404 returned error can't find the container with id f8c8f07ae46af352cd11c784b6894de9c3b2cc6870d5572d839456c0eb678230 Mar 
20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.675057 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.675281 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.675347 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:06.675330727 +0000 UTC m=+1141.556356857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.838355 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.844751 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.851064 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"] Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.858663 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b4ae24_4743_4bea_aac3_a6a2fd4b1990.slice/crio-fc93e485e7bc430ab70903c9d5343af98f4916009c7d409aaa7b42bf74a46caf WatchSource:0}: Error finding container fc93e485e7bc430ab70903c9d5343af98f4916009c7d409aaa7b42bf74a46caf: Status 404 returned error can't find the container with id fc93e485e7bc430ab70903c9d5343af98f4916009c7d409aaa7b42bf74a46caf Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.865847 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.871240 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"] Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.884156 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"] Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.892977 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf62801cd_9d41_4312_a337_2e39d0bb1997.slice/crio-51359fae452bc743d339da4106015d5b030ae8489614840272e616e524159cb1 WatchSource:0}: Error finding container 51359fae452bc743d339da4106015d5b030ae8489614840272e616e524159cb1: Status 404 returned error can't find the container with id 51359fae452bc743d339da4106015d5b030ae8489614840272e616e524159cb1 Mar 20 13:42:04 crc kubenswrapper[4856]: W0320 13:42:04.911183 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef7a6885_15ab_47ac_911f_5ef35b971f7f.slice/crio-1b154966184cc056ac194c80d9c5eb938cd73cad22fc839e2c5547cb26355e49 WatchSource:0}: Error finding container 1b154966184cc056ac194c80d9c5eb938cd73cad22fc839e2c5547cb26355e49: Status 404 returned error can't find the 
container with id 1b154966184cc056ac194c80d9c5eb938cd73cad22fc839e2c5547cb26355e49 Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.933362 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.987146 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z55lb\" (UniqueName: \"kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb\") pod \"4b72b040-1c32-472d-b5e1-8ee3a7ace646\" (UID: \"4b72b040-1c32-472d-b5e1-8ee3a7ace646\") " Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.987468 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.987604 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: E0320 13:42:04.987652 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:06.987636196 +0000 UTC m=+1141.868662326 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4856]: I0320 13:42:04.994142 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb" (OuterVolumeSpecName: "kube-api-access-z55lb") pod "4b72b040-1c32-472d-b5e1-8ee3a7ace646" (UID: "4b72b040-1c32-472d-b5e1-8ee3a7ace646"). InnerVolumeSpecName "kube-api-access-z55lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.046736 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"] Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.090603 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z55lb\" (UniqueName: \"kubernetes.io/projected/4b72b040-1c32-472d-b5e1-8ee3a7ace646-kube-api-access-z55lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.100229 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mdpkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-w6tnk_openstack-operators(e75213b1-7eab-451f-bca1-4f38db805ba7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.101719 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" podUID="e75213b1-7eab-451f-bca1-4f38db805ba7" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.106054 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fpcjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-xv6h2_openstack-operators(b3055bde-69b7-478d-8ddf-bb187b58e23e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.108122 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" podUID="b3055bde-69b7-478d-8ddf-bb187b58e23e" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.109975 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"] Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.110236 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b6x2d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-ckd6w_openstack-operators(cfd501fb-ec8a-4b56-840f-975ed1184cd3): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.112641 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" podUID="cfd501fb-ec8a-4b56-840f-975ed1184cd3" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.119247 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"] Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.157534 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"] Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.496361 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod 
\"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.496431 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.496641 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.496720 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:07.496700076 +0000 UTC m=+1142.377726196 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.498435 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.498572 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. 
No retries permitted until 2026-03-20 13:42:07.498543575 +0000 UTC m=+1142.379569705 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.675565 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8" event={"ID":"ef7a6885-15ab-47ac-911f-5ef35b971f7f","Type":"ContainerStarted","Data":"1b154966184cc056ac194c80d9c5eb938cd73cad22fc839e2c5547cb26355e49"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.676872 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" event={"ID":"b3055bde-69b7-478d-8ddf-bb187b58e23e","Type":"ContainerStarted","Data":"4783d301b24f38293e969de90a244f11df479b1c0850c3119799860d5734bdb2"} Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.680510 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" podUID="b3055bde-69b7-478d-8ddf-bb187b58e23e" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.688682 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9" event={"ID":"74388a48-2f88-4093-a143-628de32ad98c","Type":"ContainerStarted","Data":"db2d8cb44f8b8dd9e9d3df864a8afadd2310264180b81594162fc0a26b7023e6"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.691723 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" event={"ID":"e75213b1-7eab-451f-bca1-4f38db805ba7","Type":"ContainerStarted","Data":"df26a74174f47909816b11285c0765a5161f57bac782f90b0a7eadf1e3b490df"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.697224 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw" event={"ID":"52b4ae24-4743-4bea-aac3-a6a2fd4b1990","Type":"ContainerStarted","Data":"fc93e485e7bc430ab70903c9d5343af98f4916009c7d409aaa7b42bf74a46caf"} Mar 20 13:42:05 crc kubenswrapper[4856]: E0320 13:42:05.699195 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" podUID="e75213b1-7eab-451f-bca1-4f38db805ba7" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.711317 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-9gch9"] Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.726028 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg" event={"ID":"f62801cd-9d41-4312-a337-2e39d0bb1997","Type":"ContainerStarted","Data":"51359fae452bc743d339da4106015d5b030ae8489614840272e616e524159cb1"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.727858 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" event={"ID":"cfd501fb-ec8a-4b56-840f-975ed1184cd3","Type":"ContainerStarted","Data":"2777d928a5c2840f84523fe84029ffbeaf27e7382b11e84d6332560fb12d338f"} Mar 20 13:42:05 crc 
kubenswrapper[4856]: E0320 13:42:05.730163 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" podUID="cfd501fb-ec8a-4b56-840f-975ed1184cd3" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.731207 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" event={"ID":"4b72b040-1c32-472d-b5e1-8ee3a7ace646","Type":"ContainerDied","Data":"164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.731232 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="164dadf115e3ee6b0f681c30163f74b0493e16cf691c47e6f2b284a4153aaf28" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.731364 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-8j2v4" Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.735237 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-9gch9"] Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.743133 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6" event={"ID":"625deb2f-8031-4fbd-93da-c6cfb29b5d9f","Type":"ContainerStarted","Data":"8e9ad522030f6101c07e25ec1945ef2bd6548fc6d58b35a9a150a2ab71ba99d8"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.744992 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" event={"ID":"eb68c239-2237-493b-8943-597ad3822379","Type":"ContainerStarted","Data":"f8c8f07ae46af352cd11c784b6894de9c3b2cc6870d5572d839456c0eb678230"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.747236 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" event={"ID":"122f071b-3f1d-4364-8142-466caeb29677","Type":"ContainerStarted","Data":"c3bdfbb08b0f3175d2609e90cdd50d7b427efa416d352161f4cfab780e0d4c58"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.750543 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" event={"ID":"4ca1fc8c-012a-4067-8ca1-ae2424a66b65","Type":"ContainerStarted","Data":"a8e756f5d305e3bb2b999f315c2ad5ce58ca239204fd6b19b21d0c81c3190eb6"} Mar 20 13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.755456 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96" event={"ID":"f6c2630c-bcb8-45a6-96ee-1cbe64b472ea","Type":"ContainerStarted","Data":"f99e16aa0a9348d3261a5f80e94836f0317f305a1d1d9a03a1102418d1eda782"} Mar 20 
13:42:05 crc kubenswrapper[4856]: I0320 13:42:05.840927 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb91f950-617b-4179-b3b0-b03c74f45ce4" path="/var/lib/kubelet/pods/eb91f950-617b-4179-b3b0-b03c74f45ce4/volumes" Mar 20 13:42:06 crc kubenswrapper[4856]: I0320 13:42:06.717077 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:06 crc kubenswrapper[4856]: E0320 13:42:06.717484 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:06 crc kubenswrapper[4856]: E0320 13:42:06.717547 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:10.717529174 +0000 UTC m=+1145.598555304 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:06 crc kubenswrapper[4856]: E0320 13:42:06.766525 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" podUID="e75213b1-7eab-451f-bca1-4f38db805ba7" Mar 20 13:42:06 crc kubenswrapper[4856]: E0320 13:42:06.766671 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" podUID="cfd501fb-ec8a-4b56-840f-975ed1184cd3" Mar 20 13:42:06 crc kubenswrapper[4856]: E0320 13:42:06.766744 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" podUID="b3055bde-69b7-478d-8ddf-bb187b58e23e" Mar 20 13:42:07 crc kubenswrapper[4856]: I0320 13:42:07.024328 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod 
\"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.024684 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.024745 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:11.024726599 +0000 UTC m=+1145.905752739 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4856]: I0320 13:42:07.536257 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:07 crc kubenswrapper[4856]: I0320 13:42:07.536771 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " 
pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.536564 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.537173 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:11.537150988 +0000 UTC m=+1146.418177118 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.537070 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4856]: E0320 13:42:07.537841 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:11.537828345 +0000 UTC m=+1146.418854475 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found Mar 20 13:42:09 crc kubenswrapper[4856]: I0320 13:42:09.986935 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:42:09 crc kubenswrapper[4856]: I0320 13:42:09.987215 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:42:09 crc kubenswrapper[4856]: I0320 13:42:09.987252 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:42:09 crc kubenswrapper[4856]: I0320 13:42:09.987775 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:42:09 crc kubenswrapper[4856]: I0320 13:42:09.987824 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" 
containerID="cri-o://ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129" gracePeriod=600 Mar 20 13:42:10 crc kubenswrapper[4856]: I0320 13:42:10.790392 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:10 crc kubenswrapper[4856]: E0320 13:42:10.790601 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:10 crc kubenswrapper[4856]: E0320 13:42:10.790710 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:18.790686086 +0000 UTC m=+1153.671712296 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.094485 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.094839 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.094929 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:19.094906292 +0000 UTC m=+1153.975932422 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.602357 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.602576 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.602775 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.602715 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.602803 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:19.60275776 +0000 UTC m=+1154.483783890 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: E0320 13:42:11.602953 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:19.602915655 +0000 UTC m=+1154.483941985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.806763 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129" exitCode=0 Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.806820 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129"} Mar 20 13:42:11 crc kubenswrapper[4856]: I0320 13:42:11.806856 4856 scope.go:117] "RemoveContainer" containerID="4e7ce4a794c1e043feffc1d2bcf679a326c64ff079ad613e919bb36d03e9d4c3" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.203741 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.206172 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gldd6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-s2jkg_openstack-operators(f62801cd-9d41-4312-a337-2e39d0bb1997): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.208124 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg" podUID="f62801cd-9d41-4312-a337-2e39d0bb1997" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.728447 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.729028 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-p58hg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-v698k_openstack-operators(122f071b-3f1d-4364-8142-466caeb29677): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.730370 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" podUID="122f071b-3f1d-4364-8142-466caeb29677" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.859932 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" podUID="122f071b-3f1d-4364-8142-466caeb29677" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.861223 4856 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg" podUID="f62801cd-9d41-4312-a337-2e39d0bb1997" Mar 20 13:42:18 crc kubenswrapper[4856]: I0320 13:42:18.887614 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.887815 4856 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:18 crc kubenswrapper[4856]: E0320 13:42:18.887888 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert podName:fee7d83a-7c59-4a95-85b3-8f677f068731 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:34.887873296 +0000 UTC m=+1169.768899426 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert") pod "infra-operator-controller-manager-669fff9c7c-n45pl" (UID: "fee7d83a-7c59-4a95-85b3-8f677f068731") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: I0320 13:42:19.191905 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.192121 4856 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.192179 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert podName:32083d25-90e1-4571-959b-629f6d8393a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:35.192165014 +0000 UTC m=+1170.073191144 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" (UID: "32083d25-90e1-4571-959b-629f6d8393a5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.235256 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.235559 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g7gdj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-fr4gw_openstack-operators(52b4ae24-4743-4bea-aac3-a6a2fd4b1990): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.237349 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw" podUID="52b4ae24-4743-4bea-aac3-a6a2fd4b1990" Mar 20 13:42:19 crc kubenswrapper[4856]: I0320 13:42:19.699955 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod 
\"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:19 crc kubenswrapper[4856]: I0320 13:42:19.700008 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.700161 4856 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.700175 4856 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.700342 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. No retries permitted until 2026-03-20 13:42:35.700221718 +0000 UTC m=+1170.581247848 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "webhook-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.700364 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs podName:78560b1b-78fa-4282-a6c3-a06306ab470c nodeName:}" failed. 
No retries permitted until 2026-03-20 13:42:35.700355582 +0000 UTC m=+1170.581381702 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs") pod "openstack-operator-controller-manager-85d5885774-8ws6b" (UID: "78560b1b-78fa-4282-a6c3-a06306ab470c") : secret "metrics-server-cert" not found Mar 20 13:42:19 crc kubenswrapper[4856]: E0320 13:42:19.864566 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw" podUID="52b4ae24-4743-4bea-aac3-a6a2fd4b1990" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.130034 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.130183 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: 
{{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hr5xg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-2b2z6_openstack-operators(625deb2f-8031-4fbd-93da-c6cfb29b5d9f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.131459 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6" podUID="625deb2f-8031-4fbd-93da-c6cfb29b5d9f" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.586090 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.586301 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkmdh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-4qkfk_openstack-operators(602b2383-2c80-49b5-afa6-400c6022f0d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.587552 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" podUID="602b2383-2c80-49b5-afa6-400c6022f0d6" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.870980 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" podUID="602b2383-2c80-49b5-afa6-400c6022f0d6" Mar 20 13:42:20 crc kubenswrapper[4856]: E0320 13:42:20.871008 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6" podUID="625deb2f-8031-4fbd-93da-c6cfb29b5d9f" Mar 20 13:42:21 crc kubenswrapper[4856]: E0320 13:42:21.102840 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 13:42:21 crc kubenswrapper[4856]: E0320 13:42:21.103019 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q7zz2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-r6gkh_openstack-operators(eb68c239-2237-493b-8943-597ad3822379): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:21 crc kubenswrapper[4856]: E0320 13:42:21.104795 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" podUID="eb68c239-2237-493b-8943-597ad3822379" Mar 20 13:42:21 crc kubenswrapper[4856]: I0320 13:42:21.457989 4856 scope.go:117] "RemoveContainer" containerID="2e4c11104392346203a0b7b47f90c927685ddee9037e983ba4a0f25e7cadd256" Mar 20 13:42:21 crc kubenswrapper[4856]: E0320 13:42:21.896036 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" podUID="eb68c239-2237-493b-8943-597ad3822379" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.906885 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.911074 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" event={"ID":"803de023-bc1c-42f0-899b-b7053081db3b","Type":"ContainerStarted","Data":"3f107ec5c53361e0313e1388a5505716594f09ce004f992f5a8fc81f8b6bdebc"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.911218 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.913589 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" 
event={"ID":"e75213b1-7eab-451f-bca1-4f38db805ba7","Type":"ContainerStarted","Data":"030ecac5d8ad81d5278b17382260a5b6e0d2974cba0289aef39535dd82dd973f"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.914310 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.919260 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" event={"ID":"43236c8a-2018-4001-a8dc-67a9d4488f0a","Type":"ContainerStarted","Data":"ca8de10a6cb24c5c47e4e809b7261317efb2a399208c32908cb5a93b5f51c4e2"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.919957 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.922039 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8" event={"ID":"ef7a6885-15ab-47ac-911f-5ef35b971f7f","Type":"ContainerStarted","Data":"0034aff18560d0289e7f8a556f9ff06f81e3bf4b620d971bde0e2783532d2956"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.922540 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.935840 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" event={"ID":"4ca1fc8c-012a-4067-8ca1-ae2424a66b65","Type":"ContainerStarted","Data":"9ddc8aa219fbfcaf3f84e3ee84380eb40e98044fb4cb8f7403e2be03a6186d82"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.936137 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.941238 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96" event={"ID":"f6c2630c-bcb8-45a6-96ee-1cbe64b472ea","Type":"ContainerStarted","Data":"47c0a2e150a70ad582783413adee6b4233bdbeedbaf1ffabeb442119266ea2c1"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.941491 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.942586 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" event={"ID":"16329126-6028-435b-b961-b483af84efc2","Type":"ContainerStarted","Data":"d8ff389a21345cd0617c11e4a70e05cf7c159bf988e26234d3c85a540454fa55"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.943367 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.945237 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" event={"ID":"60481295-8929-4cff-88c0-fc9645c555e6","Type":"ContainerStarted","Data":"cc7e36cf229b086e4bafc4b7e23993b9f5bd1254fe27962c5619bdcf52f78d61"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.945826 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.947383 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" 
event={"ID":"ef1eeee2-e51e-4771-934c-a4b0c9e4d949","Type":"ContainerStarted","Data":"da6ad1cbf1d6755c56ea84f80e1e8708645af9de25bf84f94f524b0a3411dcfd"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.947914 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.951520 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk" podStartSLOduration=2.887151059 podStartE2EDuration="19.951504246s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.099585538 +0000 UTC m=+1139.980611678" lastFinishedPulling="2026-03-20 13:42:22.163938735 +0000 UTC m=+1157.044964865" observedRunningTime="2026-03-20 13:42:22.950335006 +0000 UTC m=+1157.831361156" watchObservedRunningTime="2026-03-20 13:42:22.951504246 +0000 UTC m=+1157.832530376" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.957132 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9" event={"ID":"74388a48-2f88-4093-a143-628de32ad98c","Type":"ContainerStarted","Data":"8cd6aa4db6961ba02e3af7ae0d44607f202f944c0490f545254bcece2417e6eb"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.957689 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.967128 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" event={"ID":"c22916c3-cf42-4583-8529-2f42a5780500","Type":"ContainerStarted","Data":"0915465a2f1aa3ef9cda863ce99957b2c1401ad4c88559e2e9feba57761b0f2a"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.967759 
4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.979111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" event={"ID":"cfd501fb-ec8a-4b56-840f-975ed1184cd3","Type":"ContainerStarted","Data":"7d6f8d8096fa97d560ed17ba0960390d898fc5f941fcc30ced494ff48e9d83a2"} Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.979838 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" Mar 20 13:42:22 crc kubenswrapper[4856]: I0320 13:42:22.997119 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74" podStartSLOduration=3.892413735 podStartE2EDuration="20.997098004s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:03.999128192 +0000 UTC m=+1138.880154322" lastFinishedPulling="2026-03-20 13:42:21.103812471 +0000 UTC m=+1155.984838591" observedRunningTime="2026-03-20 13:42:22.99276278 +0000 UTC m=+1157.873788910" watchObservedRunningTime="2026-03-20 13:42:22.997098004 +0000 UTC m=+1157.878124124" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.039650 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8" podStartSLOduration=4.850788848 podStartE2EDuration="21.039632973s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.915010537 +0000 UTC m=+1139.796036667" lastFinishedPulling="2026-03-20 13:42:21.103854662 +0000 UTC m=+1155.984880792" observedRunningTime="2026-03-20 13:42:23.027243037 +0000 UTC m=+1157.908269167" watchObservedRunningTime="2026-03-20 13:42:23.039632973 
+0000 UTC m=+1157.920659103" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.126996 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db" podStartSLOduration=4.517637421 podStartE2EDuration="21.126974178s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.494784621 +0000 UTC m=+1139.375810751" lastFinishedPulling="2026-03-20 13:42:21.104121388 +0000 UTC m=+1155.985147508" observedRunningTime="2026-03-20 13:42:23.121602017 +0000 UTC m=+1158.002628167" watchObservedRunningTime="2026-03-20 13:42:23.126974178 +0000 UTC m=+1158.008000308" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.253902 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w" podStartSLOduration=4.158054539 podStartE2EDuration="21.253882414s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.110103124 +0000 UTC m=+1139.991129254" lastFinishedPulling="2026-03-20 13:42:22.205930979 +0000 UTC m=+1157.086957129" observedRunningTime="2026-03-20 13:42:23.253369981 +0000 UTC m=+1158.134396121" watchObservedRunningTime="2026-03-20 13:42:23.253882414 +0000 UTC m=+1158.134908544" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.257544 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" podStartSLOduration=4.018359566 podStartE2EDuration="21.25753149s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:03.865405337 +0000 UTC m=+1138.746431467" lastFinishedPulling="2026-03-20 13:42:21.104577261 +0000 UTC m=+1155.985603391" observedRunningTime="2026-03-20 13:42:23.206510909 +0000 UTC m=+1158.087537039" watchObservedRunningTime="2026-03-20 13:42:23.25753149 +0000 UTC 
m=+1158.138557620" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.315680 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6" podStartSLOduration=4.699944853 podStartE2EDuration="21.315645938s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.902153499 +0000 UTC m=+1139.783179629" lastFinishedPulling="2026-03-20 13:42:21.517854584 +0000 UTC m=+1156.398880714" observedRunningTime="2026-03-20 13:42:23.307954945 +0000 UTC m=+1158.188981075" watchObservedRunningTime="2026-03-20 13:42:23.315645938 +0000 UTC m=+1158.196672068" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.363065 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7" podStartSLOduration=4.5819068 podStartE2EDuration="21.363046184s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.32278863 +0000 UTC m=+1139.203814760" lastFinishedPulling="2026-03-20 13:42:21.103928014 +0000 UTC m=+1155.984954144" observedRunningTime="2026-03-20 13:42:23.358187986 +0000 UTC m=+1158.239214136" watchObservedRunningTime="2026-03-20 13:42:23.363046184 +0000 UTC m=+1158.244072304" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.386186 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9" podStartSLOduration=4.9511294150000005 podStartE2EDuration="21.386165871s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.082486439 +0000 UTC m=+1139.963512559" lastFinishedPulling="2026-03-20 13:42:21.517522885 +0000 UTC m=+1156.398549015" observedRunningTime="2026-03-20 13:42:23.384020065 +0000 UTC m=+1158.265046195" watchObservedRunningTime="2026-03-20 13:42:23.386165871 +0000 UTC 
m=+1158.267192001" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.434148 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76" podStartSLOduration=4.35176839 podStartE2EDuration="21.434131232s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.022992149 +0000 UTC m=+1138.904018279" lastFinishedPulling="2026-03-20 13:42:21.105354981 +0000 UTC m=+1155.986381121" observedRunningTime="2026-03-20 13:42:23.430021294 +0000 UTC m=+1158.311047434" watchObservedRunningTime="2026-03-20 13:42:23.434131232 +0000 UTC m=+1158.315157362" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.457313 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99" podStartSLOduration=4.349764577 podStartE2EDuration="21.457294171s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:03.997091238 +0000 UTC m=+1138.878117368" lastFinishedPulling="2026-03-20 13:42:21.104620832 +0000 UTC m=+1155.985646962" observedRunningTime="2026-03-20 13:42:23.451701024 +0000 UTC m=+1158.332727144" watchObservedRunningTime="2026-03-20 13:42:23.457294171 +0000 UTC m=+1158.338320311" Mar 20 13:42:23 crc kubenswrapper[4856]: I0320 13:42:23.484578 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96" podStartSLOduration=3.875120817 podStartE2EDuration="20.484525837s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.909234325 +0000 UTC m=+1139.790260455" lastFinishedPulling="2026-03-20 13:42:21.518639345 +0000 UTC m=+1156.399665475" observedRunningTime="2026-03-20 13:42:23.481215669 +0000 UTC m=+1158.362241829" watchObservedRunningTime="2026-03-20 13:42:23.484525837 +0000 UTC m=+1158.365551967" Mar 
20 13:42:26 crc kubenswrapper[4856]: I0320 13:42:26.002211 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" event={"ID":"b3055bde-69b7-478d-8ddf-bb187b58e23e","Type":"ContainerStarted","Data":"e8b7fcca0c6b4b14032fa2cbcfb6124e6a0636c1e646c47c7c66d77dae639c31"} Mar 20 13:42:26 crc kubenswrapper[4856]: I0320 13:42:26.003036 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" Mar 20 13:42:26 crc kubenswrapper[4856]: I0320 13:42:26.024200 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2" podStartSLOduration=3.116435345 podStartE2EDuration="23.02417772s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.105829462 +0000 UTC m=+1139.986855592" lastFinishedPulling="2026-03-20 13:42:25.013571827 +0000 UTC m=+1159.894597967" observedRunningTime="2026-03-20 13:42:26.01808368 +0000 UTC m=+1160.899109840" watchObservedRunningTime="2026-03-20 13:42:26.02417772 +0000 UTC m=+1160.905203860" Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.011168 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-h2vzr" Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.056556 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" event={"ID":"602b2383-2c80-49b5-afa6-400c6022f0d6","Type":"ContainerStarted","Data":"a94c7fbe9bcb7f4722ebfcb535355534506988e12ea8edc2a65d442a1be28d69"} Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.057156 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" Mar 20 
13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.058754 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6" event={"ID":"625deb2f-8031-4fbd-93da-c6cfb29b5d9f","Type":"ContainerStarted","Data":"df7223decd83b26b92b492523a9681da0bea73b79a09cdb09a5c427f26260436"}
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.059419 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.060296 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-7xj74"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.061164 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" event={"ID":"122f071b-3f1d-4364-8142-466caeb29677","Type":"ContainerStarted","Data":"936e326d287e6708eacee2d1671029e0d390e79d3cb68e2843cc53854cfca5e2"}
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.061794 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.068319 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw" event={"ID":"52b4ae24-4743-4bea-aac3-a6a2fd4b1990","Type":"ContainerStarted","Data":"8e23ec7463cc88927d468f764847113c9cc471cace50e0b795b84c8fd3ac9ac6"}
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.068690 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.080994 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-7bv99"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.086111 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk" podStartSLOduration=3.085258187 podStartE2EDuration="31.086089289s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.505618286 +0000 UTC m=+1139.386644416" lastFinishedPulling="2026-03-20 13:42:32.506449388 +0000 UTC m=+1167.387475518" observedRunningTime="2026-03-20 13:42:33.08319371 +0000 UTC m=+1167.964219850" watchObservedRunningTime="2026-03-20 13:42:33.086089289 +0000 UTC m=+1167.967115419"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.103330 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw" podStartSLOduration=3.6665466799999997 podStartE2EDuration="31.10330637s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.865308261 +0000 UTC m=+1139.746334391" lastFinishedPulling="2026-03-20 13:42:32.302067951 +0000 UTC m=+1167.183094081" observedRunningTime="2026-03-20 13:42:33.099117226 +0000 UTC m=+1167.980143356" watchObservedRunningTime="2026-03-20 13:42:33.10330637 +0000 UTC m=+1167.984332520"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.154730 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k" podStartSLOduration=3.241304203 podStartE2EDuration="31.154713528s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.667414039 +0000 UTC m=+1139.548440169" lastFinishedPulling="2026-03-20 13:42:32.580823364 +0000 UTC m=+1167.461849494" observedRunningTime="2026-03-20 13:42:33.150485903 +0000 UTC m=+1168.031512033" watchObservedRunningTime="2026-03-20 13:42:33.154713528 +0000 UTC m=+1168.035739658"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.168464 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6" podStartSLOduration=2.748079734 podStartE2EDuration="30.168448345s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.882744979 +0000 UTC m=+1139.763771109" lastFinishedPulling="2026-03-20 13:42:32.30311359 +0000 UTC m=+1167.184139720" observedRunningTime="2026-03-20 13:42:33.166042928 +0000 UTC m=+1168.047069068" watchObservedRunningTime="2026-03-20 13:42:33.168448345 +0000 UTC m=+1168.049474475"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.252615 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-hwz76"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.318248 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-pv5l7"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.403094 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-ckd6w"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.423853 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-rf4db"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.456384 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-7zxs6"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.527605 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-95dk8"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.581037 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-58gr9"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.724206 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-xv6h2"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.747695 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-nwp96"
Mar 20 13:42:33 crc kubenswrapper[4856]: I0320 13:42:33.905195 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-w6tnk"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.083601 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" event={"ID":"eb68c239-2237-493b-8943-597ad3822379","Type":"ContainerStarted","Data":"3ab44322904868a13df370634280a442caa93bf7a93e4ef8138de478e98acd78"}
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.084406 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.101553 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh" podStartSLOduration=3.553344585 podStartE2EDuration="32.101535084s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.669789931 +0000 UTC m=+1139.550816061" lastFinishedPulling="2026-03-20 13:42:33.21798043 +0000 UTC m=+1168.099006560" observedRunningTime="2026-03-20 13:42:34.099513409 +0000 UTC m=+1168.980539549" watchObservedRunningTime="2026-03-20 13:42:34.101535084 +0000 UTC m=+1168.982561214"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.933027 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.937714 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fee7d83a-7c59-4a95-85b3-8f677f068731-cert\") pod \"infra-operator-controller-manager-669fff9c7c-n45pl\" (UID: \"fee7d83a-7c59-4a95-85b3-8f677f068731\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.965602 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2kpqp"
Mar 20 13:42:34 crc kubenswrapper[4856]: I0320 13:42:34.974520 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.240011 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.243952 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/32083d25-90e1-4571-959b-629f6d8393a5-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f5qfxfb\" (UID: \"32083d25-90e1-4571-959b-629f6d8393a5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.420573 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-f99ct"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.429081 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.518727 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"]
Mar 20 13:42:35 crc kubenswrapper[4856]: W0320 13:42:35.529692 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee7d83a_7c59_4a95_85b3_8f677f068731.slice/crio-65a462e697089312550bdce65264e14dc64edabbb6392785e937d782a3b4a150 WatchSource:0}: Error finding container 65a462e697089312550bdce65264e14dc64edabbb6392785e937d782a3b4a150: Status 404 returned error can't find the container with id 65a462e697089312550bdce65264e14dc64edabbb6392785e937d782a3b4a150
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.750286 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.750571 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.758053 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-metrics-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.773204 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/78560b1b-78fa-4282-a6c3-a06306ab470c-webhook-certs\") pod \"openstack-operator-controller-manager-85d5885774-8ws6b\" (UID: \"78560b1b-78fa-4282-a6c3-a06306ab470c\") " pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.849600 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-c6ff6"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.854458 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:35 crc kubenswrapper[4856]: I0320 13:42:35.904482 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"]
Mar 20 13:42:36 crc kubenswrapper[4856]: W0320 13:42:36.088200 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78560b1b_78fa_4282_a6c3_a06306ab470c.slice/crio-4aebf75d4fc14f919f3696f0d249eaa4b23464055b40b858ada44cd12b8e2889 WatchSource:0}: Error finding container 4aebf75d4fc14f919f3696f0d249eaa4b23464055b40b858ada44cd12b8e2889: Status 404 returned error can't find the container with id 4aebf75d4fc14f919f3696f0d249eaa4b23464055b40b858ada44cd12b8e2889
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.093013 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"]
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.195137 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" event={"ID":"fee7d83a-7c59-4a95-85b3-8f677f068731","Type":"ContainerStarted","Data":"65a462e697089312550bdce65264e14dc64edabbb6392785e937d782a3b4a150"}
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.196208 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg" event={"ID":"f62801cd-9d41-4312-a337-2e39d0bb1997","Type":"ContainerStarted","Data":"06a4d7a328c615a6c6c83edadbd818d5f6956ca1ca3462cf6b4ce9d51223498c"}
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.197427 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" event={"ID":"78560b1b-78fa-4282-a6c3-a06306ab470c","Type":"ContainerStarted","Data":"4aebf75d4fc14f919f3696f0d249eaa4b23464055b40b858ada44cd12b8e2889"}
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.198635 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" event={"ID":"32083d25-90e1-4571-959b-629f6d8393a5","Type":"ContainerStarted","Data":"f8b0264a8c03cc0adef54995520b507837b4a7899a08033e39df5d3d0175487e"}
Mar 20 13:42:36 crc kubenswrapper[4856]: I0320 13:42:36.214185 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg" podStartSLOduration=4.460109653 podStartE2EDuration="34.214165501s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.896543011 +0000 UTC m=+1139.777569141" lastFinishedPulling="2026-03-20 13:42:34.650598859 +0000 UTC m=+1169.531624989" observedRunningTime="2026-03-20 13:42:36.212109615 +0000 UTC m=+1171.093135755" watchObservedRunningTime="2026-03-20 13:42:36.214165501 +0000 UTC m=+1171.095191631"
Mar 20 13:42:37 crc kubenswrapper[4856]: I0320 13:42:37.207180 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" event={"ID":"78560b1b-78fa-4282-a6c3-a06306ab470c","Type":"ContainerStarted","Data":"de04dcd3d7326985e62836a63662f637cbe0acdc11a5c380e6a5ce019b02ad64"}
Mar 20 13:42:37 crc kubenswrapper[4856]: I0320 13:42:37.207287 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:42:37 crc kubenswrapper[4856]: I0320 13:42:37.251200 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b" podStartSLOduration=34.251182027 podStartE2EDuration="34.251182027s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:37.244995427 +0000 UTC m=+1172.126021577" watchObservedRunningTime="2026-03-20 13:42:37.251182027 +0000 UTC m=+1172.132208157"
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.222373 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" event={"ID":"32083d25-90e1-4571-959b-629f6d8393a5","Type":"ContainerStarted","Data":"b0bf00208cfe1104bdf52c4c222d222ac1c73be184c064261201820885060ba8"}
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.222970 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.223935 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" event={"ID":"fee7d83a-7c59-4a95-85b3-8f677f068731","Type":"ContainerStarted","Data":"20cf56ea1d2510cac598c65d3c64ca8d79e55977cf20e3fda3cfa5ffbd5ee32b"}
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.224093 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.254307 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb" podStartSLOduration=34.492128423 podStartE2EDuration="37.254288426s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:35.915675458 +0000 UTC m=+1170.796701588" lastFinishedPulling="2026-03-20 13:42:38.677835451 +0000 UTC m=+1173.558861591" observedRunningTime="2026-03-20 13:42:39.248459695 +0000 UTC m=+1174.129485845" watchObservedRunningTime="2026-03-20 13:42:39.254288426 +0000 UTC m=+1174.135314566"
Mar 20 13:42:39 crc kubenswrapper[4856]: I0320 13:42:39.274298 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl" podStartSLOduration=34.149790879 podStartE2EDuration="37.274264012s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:35.533095883 +0000 UTC m=+1170.414122013" lastFinishedPulling="2026-03-20 13:42:38.657569026 +0000 UTC m=+1173.538595146" observedRunningTime="2026-03-20 13:42:39.27307549 +0000 UTC m=+1174.154101670" watchObservedRunningTime="2026-03-20 13:42:39.274264012 +0000 UTC m=+1174.155306063"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.329569 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-4qkfk"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.356458 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-v698k"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.546882 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r6gkh"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.603679 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.605941 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-s2jkg"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.669056 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-fr4gw"
Mar 20 13:42:43 crc kubenswrapper[4856]: I0320 13:42:43.692576 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-2b2z6"
Mar 20 13:42:44 crc kubenswrapper[4856]: I0320 13:42:44.979435 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-n45pl"
Mar 20 13:42:56 crc kubenswrapper[4856]: I0320 13:42:45.434499 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f5qfxfb"
Mar 20 13:42:56 crc kubenswrapper[4856]: I0320 13:42:45.861803 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85d5885774-8ws6b"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.967380 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"]
Mar 20 13:43:02 crc kubenswrapper[4856]: E0320 13:43:02.969245 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b72b040-1c32-472d-b5e1-8ee3a7ace646" containerName="oc"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.969381 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b72b040-1c32-472d-b5e1-8ee3a7ace646" containerName="oc"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.969656 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b72b040-1c32-472d-b5e1-8ee3a7ace646" containerName="oc"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.970624 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.973119 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.973185 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bmrbg"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.973118 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.975050 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 13:43:02 crc kubenswrapper[4856]: I0320 13:43:02.997959 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"]
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.000603 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5xt6\" (UniqueName: \"kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.000741 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.056050 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"]
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.058751 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.065121 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.072949 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"]
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.102472 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.102565 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5xt6\" (UniqueName: \"kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.102610 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhwd7\" (UniqueName: \"kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.102680 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.102748 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.103762 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.133640 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5xt6\" (UniqueName: \"kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6\") pod \"dnsmasq-dns-675f4bcbfc-v5h7t\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.204036 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.204505 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhwd7\" (UniqueName: \"kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.204708 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.204969 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.205715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.222650 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhwd7\" (UniqueName: \"kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7\") pod \"dnsmasq-dns-78dd6ddcc-tfzst\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.301391 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.380233 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst"
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.731741 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"]
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.737814 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:43:03 crc kubenswrapper[4856]: I0320 13:43:03.815060 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.086743 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.115347 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.116569 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.127552 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.220923 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk5zr\" (UniqueName: \"kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.221357 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.221439 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.322876 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.322975 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk5zr\" (UniqueName: \"kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.323013 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.323875 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.324053 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.344259 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk5zr\" (UniqueName: \"kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr\") pod \"dnsmasq-dns-5ccc8479f9-vzdd5\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.463786 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.475219 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst" event={"ID":"7ac7f2bc-ef21-4f0f-a90e-bca60b501292","Type":"ContainerStarted","Data":"87c546951fc590825edb1387f51015da160ec4b17d0005474eedc02ddbf8aa6b"}
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.494816 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t" event={"ID":"1213e4bb-eae3-469e-9f60-6ecb48671838","Type":"ContainerStarted","Data":"0b94b96a909a40bbf3a119fbe81ae5a5c0c994417ea6cebac237e1c6c4fcec43"}
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.512959 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.544724 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.546201 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.571398 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"]
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.656993 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.657231 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.657304 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrk8\" (UniqueName: \"kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.758655 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.758733 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrk8\" (UniqueName: \"kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.758774 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.759675 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.759840 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.782216 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrk8\" (UniqueName: \"kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8\") pod \"dnsmasq-dns-57d769cc4f-ztjtc\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc"
Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.919488 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" Mar 20 13:43:04 crc kubenswrapper[4856]: I0320 13:43:04.950187 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"] Mar 20 13:43:04 crc kubenswrapper[4856]: W0320 13:43:04.960323 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21b1b99a_1f3f_4824_a3d8_67f180094b11.slice/crio-250c8af51d15315e63c967b21ee6a8e0d7a22ac4bd9437b9296276167780b9a8 WatchSource:0}: Error finding container 250c8af51d15315e63c967b21ee6a8e0d7a22ac4bd9437b9296276167780b9a8: Status 404 returned error can't find the container with id 250c8af51d15315e63c967b21ee6a8e0d7a22ac4bd9437b9296276167780b9a8 Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.096060 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.097163 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.099467 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.103870 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6c95v" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.104163 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.104419 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.104544 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.104602 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.104763 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.110678 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.163747 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.163806 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164145 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164323 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqxmt\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164401 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164445 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164557 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164647 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164732 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164844 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.164894 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.199311 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"] Mar 20 13:43:05 
crc kubenswrapper[4856]: W0320 13:43:05.214925 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab5db032_d1c7_4491_a098_d526b85fe6ee.slice/crio-f2a0e796e44850abb37da2fb084e0d692ef859c1080f5329bd3af8d15ab97a38 WatchSource:0}: Error finding container f2a0e796e44850abb37da2fb084e0d692ef859c1080f5329bd3af8d15ab97a38: Status 404 returned error can't find the container with id f2a0e796e44850abb37da2fb084e0d692ef859c1080f5329bd3af8d15ab97a38 Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266478 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266538 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266574 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266610 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqxmt\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266639 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266680 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266714 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266748 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266779 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266823 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.266853 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.267585 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.267938 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.268214 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.268739 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.268892 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.268908 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.273434 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.273572 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.273662 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.288682 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqxmt\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.289201 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.299140 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.423148 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.484368 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.486221 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.491453 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.491605 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.491825 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.492026 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.492053 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j8nvl" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.492295 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.492339 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.492872 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.505230 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" event={"ID":"21b1b99a-1f3f-4824-a3d8-67f180094b11","Type":"ContainerStarted","Data":"250c8af51d15315e63c967b21ee6a8e0d7a22ac4bd9437b9296276167780b9a8"} Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.515352 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" 
event={"ID":"ab5db032-d1c7-4491-a098-d526b85fe6ee","Type":"ContainerStarted","Data":"f2a0e796e44850abb37da2fb084e0d692ef859c1080f5329bd3af8d15ab97a38"} Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571657 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571697 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571724 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571738 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571758 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dj6b\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b\") pod 
\"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571906 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571940 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.571995 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.572082 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.572100 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " 
pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.572136 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.672877 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.672910 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.672935 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dj6b\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.672959 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.672978 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673007 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673047 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673062 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673080 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673121 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.673138 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.674597 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.674754 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.675172 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.675627 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.678305 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.679516 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.688226 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.688474 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.689312 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.690627 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls\") pod 
\"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.691706 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dj6b\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.694382 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.824508 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-j8nvl" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.842884 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:43:05 crc kubenswrapper[4856]: I0320 13:43:05.897609 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:43:05 crc kubenswrapper[4856]: W0320 13:43:05.909230 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a5438ec_0454_4d8e_b356_f9b87b66c2d7.slice/crio-164e3875551f0c6e8999e3b06c1a19b93e4ddd6e550c2b7f47c7041ccdd268ec WatchSource:0}: Error finding container 164e3875551f0c6e8999e3b06c1a19b93e4ddd6e550c2b7f47c7041ccdd268ec: Status 404 returned error can't find the container with id 164e3875551f0c6e8999e3b06c1a19b93e4ddd6e550c2b7f47c7041ccdd268ec Mar 20 13:43:06 crc kubenswrapper[4856]: I0320 13:43:06.238347 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:06 crc kubenswrapper[4856]: W0320 13:43:06.281573 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd2bd8e2_7f52_4c35_ac1d_f1175581a751.slice/crio-15b649f7b1ee27f41c296ceed5d90c6f5d98f67d5a3da929931045f2a11a0752 WatchSource:0}: Error finding container 15b649f7b1ee27f41c296ceed5d90c6f5d98f67d5a3da929931045f2a11a0752: Status 404 returned error can't find the container with id 15b649f7b1ee27f41c296ceed5d90c6f5d98f67d5a3da929931045f2a11a0752 Mar 20 13:43:06 crc kubenswrapper[4856]: I0320 13:43:06.534878 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerStarted","Data":"15b649f7b1ee27f41c296ceed5d90c6f5d98f67d5a3da929931045f2a11a0752"} Mar 20 13:43:06 crc kubenswrapper[4856]: I0320 13:43:06.536669 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerStarted","Data":"164e3875551f0c6e8999e3b06c1a19b93e4ddd6e550c2b7f47c7041ccdd268ec"} Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.074249 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.075854 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.093457 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.093985 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.094846 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-k844m" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.095624 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.096241 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.096926 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112171 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112248 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112289 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112319 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmh7x\" (UniqueName: \"kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112337 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112353 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112446 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.112462 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216745 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216811 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216832 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216861 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmh7x\" (UniqueName: 
\"kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216882 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216902 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216975 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.216997 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.218447 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " 
pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.219915 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.222992 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.223163 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.225064 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.226334 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.241603 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.253841 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmh7x\" (UniqueName: \"kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.271457 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.416703 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:43:07 crc kubenswrapper[4856]: I0320 13:43:07.953116 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:07 crc kubenswrapper[4856]: W0320 13:43:07.973460 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4397f29e_c0c9_4726_8fb4_1afe1441ec83.slice/crio-d7e2fd620db7fa74f03dd59aeb1eccd56caeea0f5e61b41c35f96ce465b1e491 WatchSource:0}: Error finding container d7e2fd620db7fa74f03dd59aeb1eccd56caeea0f5e61b41c35f96ce465b1e491: Status 404 returned error can't find the container with id d7e2fd620db7fa74f03dd59aeb1eccd56caeea0f5e61b41c35f96ce465b1e491 Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.375679 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.378142 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.384817 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.385023 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.385149 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.385532 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dr8n4" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.388655 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.543871 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544199 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544257 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544398 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544455 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544559 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544580 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9pwm\" (UniqueName: \"kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.544611 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.577651 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerStarted","Data":"d7e2fd620db7fa74f03dd59aeb1eccd56caeea0f5e61b41c35f96ce465b1e491"} Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647295 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647344 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647392 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647415 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9pwm\" (UniqueName: \"kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " 
pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647440 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647468 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647499 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.647530 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.648046 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/openstack-cell1-galera-0" Mar 20 
13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.648810 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.648809 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.649261 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.649498 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.655866 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.658185 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.670005 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9pwm\" (UniqueName: \"kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.703473 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"openstack-cell1-galera-0\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.705542 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.710463 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.714042 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.714042 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-g6v9x" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.714442 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.751018 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4kv\" (UniqueName: \"kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.751103 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.751188 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.751248 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data\") pod \"memcached-0\" (UID: 
\"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.751314 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.773922 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.852368 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4kv\" (UniqueName: \"kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.852535 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.852685 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.852774 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") 
" pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.852860 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.853644 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.855528 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.863849 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.866685 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:08 crc kubenswrapper[4856]: I0320 13:43:08.875164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4kv\" (UniqueName: 
\"kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv\") pod \"memcached-0\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " pod="openstack/memcached-0" Mar 20 13:43:09 crc kubenswrapper[4856]: I0320 13:43:09.003528 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:09 crc kubenswrapper[4856]: I0320 13:43:09.055318 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:43:10 crc kubenswrapper[4856]: I0320 13:43:10.837680 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:43:10 crc kubenswrapper[4856]: I0320 13:43:10.839215 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:43:10 crc kubenswrapper[4856]: I0320 13:43:10.842445 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-5cxcl" Mar 20 13:43:10 crc kubenswrapper[4856]: I0320 13:43:10.855820 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:43:10 crc kubenswrapper[4856]: I0320 13:43:10.902240 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l24l2\" (UniqueName: \"kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2\") pod \"kube-state-metrics-0\" (UID: \"e0d71c6e-58a3-48cb-8a06-564dafb339d4\") " pod="openstack/kube-state-metrics-0" Mar 20 13:43:11 crc kubenswrapper[4856]: I0320 13:43:11.003460 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l24l2\" (UniqueName: \"kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2\") pod \"kube-state-metrics-0\" (UID: \"e0d71c6e-58a3-48cb-8a06-564dafb339d4\") " pod="openstack/kube-state-metrics-0" Mar 20 
13:43:11 crc kubenswrapper[4856]: I0320 13:43:11.023749 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l24l2\" (UniqueName: \"kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2\") pod \"kube-state-metrics-0\" (UID: \"e0d71c6e-58a3-48cb-8a06-564dafb339d4\") " pod="openstack/kube-state-metrics-0" Mar 20 13:43:11 crc kubenswrapper[4856]: I0320 13:43:11.179992 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.145004 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.148207 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.152053 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-56q9l" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.152117 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.152347 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.152400 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.152650 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.156861 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266094 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2f57\" (UniqueName: \"kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266171 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266199 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266260 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266299 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266319 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266344 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.266385 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367766 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367831 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2f57\" (UniqueName: \"kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367861 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config\") pod 
\"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367889 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367954 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367971 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.367992 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.368015 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.369162 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.370692 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.370754 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.370978 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.382508 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.390472 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" 
(UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.391589 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.407194 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2f57\" (UniqueName: \"kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.439636 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:14 crc kubenswrapper[4856]: I0320 13:43:14.477232 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.159690 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.160841 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.168540 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.168783 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.169413 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-95zth" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.177767 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.183657 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.183858 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6ft\" (UniqueName: \"kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.183917 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.183945 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.183981 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.184011 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.184040 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.192791 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.194688 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.202135 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284818 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284872 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284914 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284934 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284959 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6ft\" (UniqueName: \"kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft\") pod 
\"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.284976 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7wrd\" (UniqueName: \"kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285003 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285028 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285053 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285071 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") 
" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285116 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285135 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285167 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285737 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285810 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.285833 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.287475 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.290858 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.303676 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.304435 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6ft\" (UniqueName: \"kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft\") pod \"ovn-controller-c697k\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386628 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386700 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386779 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386816 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7wrd\" (UniqueName: \"kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386863 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.386897 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log\") pod \"ovn-controller-ovs-qxlnx\" (UID: 
\"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.387136 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.387527 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.387679 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.387745 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.389464 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.411032 4856 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z7wrd\" (UniqueName: \"kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd\") pod \"ovn-controller-ovs-qxlnx\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.486866 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k" Mar 20 13:43:15 crc kubenswrapper[4856]: I0320 13:43:15.517799 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.429821 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.432110 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.433676 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.434830 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.436750 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.436821 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-r69zk" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.449237 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534325 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534382 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqmw9\" (UniqueName: \"kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534435 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534479 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534509 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534731 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.534865 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.535037 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.636860 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.636918 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.636940 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " 
pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.636965 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.636999 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.637051 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.637084 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.637103 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqmw9\" (UniqueName: \"kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.637734 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.638284 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.638396 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.639079 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.642612 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.643085 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") 
" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.650058 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.665198 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.668713 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqmw9\" (UniqueName: \"kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9\") pod \"ovsdbserver-sb-0\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:18 crc kubenswrapper[4856]: I0320 13:43:18.800131 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:38 crc kubenswrapper[4856]: E0320 13:43:38.588720 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Mar 20 13:43:38 crc kubenswrapper[4856]: E0320 13:43:38.589443 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qmh7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,S
ecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(4397f29e-c0c9-4726-8fb4-1afe1441ec83): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:38 crc kubenswrapper[4856]: E0320 13:43:38.590810 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" Mar 20 13:43:38 crc kubenswrapper[4856]: E0320 13:43:38.821527 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.266349 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.266544 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* 
--conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhwd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-78dd6ddcc-tfzst_openstack(7ac7f2bc-ef21-4f0f-a90e-bca60b501292): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.267722 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst" podUID="7ac7f2bc-ef21-4f0f-a90e-bca60b501292" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.268051 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.268180 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4vrk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-ztjtc_openstack(ab5db032-d1c7-4491-a098-d526b85fe6ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.269388 4856 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" Mar 20 13:43:39 crc kubenswrapper[4856]: E0320 13:43:39.831138 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.457419 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.457761 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jk5zr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-vzdd5_openstack(21b1b99a-1f3f-4824-a3d8-67f180094b11): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.460887 4856 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" podUID="21b1b99a-1f3f-4824-a3d8-67f180094b11" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.473144 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.486983 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.487638 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5xt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-v5h7t_openstack(1213e4bb-eae3-469e-9f60-6ecb48671838): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.490982 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t" podUID="1213e4bb-eae3-469e-9f60-6ecb48671838" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.552569 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config\") pod \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.552688 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhwd7\" (UniqueName: \"kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7\") pod \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.552761 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc\") pod \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\" (UID: \"7ac7f2bc-ef21-4f0f-a90e-bca60b501292\") " Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.553117 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config" (OuterVolumeSpecName: "config") pod "7ac7f2bc-ef21-4f0f-a90e-bca60b501292" (UID: "7ac7f2bc-ef21-4f0f-a90e-bca60b501292"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.553419 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ac7f2bc-ef21-4f0f-a90e-bca60b501292" (UID: "7ac7f2bc-ef21-4f0f-a90e-bca60b501292"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.559404 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7" (OuterVolumeSpecName: "kube-api-access-zhwd7") pod "7ac7f2bc-ef21-4f0f-a90e-bca60b501292" (UID: "7ac7f2bc-ef21-4f0f-a90e-bca60b501292"). InnerVolumeSpecName "kube-api-access-zhwd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.654260 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.654310 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhwd7\" (UniqueName: \"kubernetes.io/projected/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-kube-api-access-zhwd7\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.654322 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ac7f2bc-ef21-4f0f-a90e-bca60b501292-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.832897 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.833320 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tfzst" event={"ID":"7ac7f2bc-ef21-4f0f-a90e-bca60b501292","Type":"ContainerDied","Data":"87c546951fc590825edb1387f51015da160ec4b17d0005474eedc02ddbf8aa6b"} Mar 20 13:43:40 crc kubenswrapper[4856]: E0320 13:43:40.839341 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" podUID="21b1b99a-1f3f-4824-a3d8-67f180094b11" Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.882680 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.919431 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"] Mar 20 13:43:40 crc kubenswrapper[4856]: I0320 13:43:40.926096 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tfzst"] Mar 20 13:43:41 crc kubenswrapper[4856]: I0320 13:43:41.843701 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ac7f2bc-ef21-4f0f-a90e-bca60b501292" path="/var/lib/kubelet/pods/7ac7f2bc-ef21-4f0f-a90e-bca60b501292/volumes" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.569919 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.570181 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:43:42 crc kubenswrapper[4856]: init container 
&Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 13:43:42 crc kubenswrapper[4856]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 13:43:42 crc kubenswrapper[4856]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 13:43:42 crc kubenswrapper[4856]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 13:43:42 crc kubenswrapper[4856]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 13:43:42 crc kubenswrapper[4856]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqxmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(0a5438ec-0454-4d8e-b356-f9b87b66c2d7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 13:43:42 crc kubenswrapper[4856]: > 
logger="UnhandledError" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.571891 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.573165 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/openstack-rabbitmq:r42p" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.573372 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:43:42 crc kubenswrapper[4856]: init container &Container{Name:setup-container,Image:quay.io/lmiccini/openstack-rabbitmq:r42p,Command:[sh -c],Args:[set -e Mar 20 13:43:42 crc kubenswrapper[4856]: cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie Mar 20 13:43:42 crc kubenswrapper[4856]: chmod 600 /var/lib/rabbitmq/.erlang.cookie Mar 20 13:43:42 crc kubenswrapper[4856]: cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins Mar 20 13:43:42 crc kubenswrapper[4856]: echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf Mar 20 13:43:42 crc kubenswrapper[4856]: # Allow time for multi-pod clusters to complete peer discovery Mar 20 13:43:42 crc kubenswrapper[4856]: sleep 30],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8dj6b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(dd2bd8e2-7f52-4c35-ac1d-f1175581a751): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 20 13:43:42 crc kubenswrapper[4856]: > logger="UnhandledError" 
Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.574973 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.631959 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.682578 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config\") pod \"1213e4bb-eae3-469e-9f60-6ecb48671838\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.682623 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5xt6\" (UniqueName: \"kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6\") pod \"1213e4bb-eae3-469e-9f60-6ecb48671838\" (UID: \"1213e4bb-eae3-469e-9f60-6ecb48671838\") " Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.683296 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config" (OuterVolumeSpecName: "config") pod "1213e4bb-eae3-469e-9f60-6ecb48671838" (UID: "1213e4bb-eae3-469e-9f60-6ecb48671838"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.693084 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6" (OuterVolumeSpecName: "kube-api-access-p5xt6") pod "1213e4bb-eae3-469e-9f60-6ecb48671838" (UID: "1213e4bb-eae3-469e-9f60-6ecb48671838"). InnerVolumeSpecName "kube-api-access-p5xt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.784171 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1213e4bb-eae3-469e-9f60-6ecb48671838-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.784213 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5xt6\" (UniqueName: \"kubernetes.io/projected/1213e4bb-eae3-469e-9f60-6ecb48671838-kube-api-access-p5xt6\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.874710 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerStarted","Data":"85809beb348a1e815d58f20f7a46b185caf09efd3526b86a930c0cb608df5be1"} Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.875803 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t" event={"ID":"1213e4bb-eae3-469e-9f60-6ecb48671838","Type":"ContainerDied","Data":"0b94b96a909a40bbf3a119fbe81ae5a5c0c994417ea6cebac237e1c6c4fcec43"} Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.875812 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-v5h7t" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.877397 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-server-0" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" Mar 20 13:43:42 crc kubenswrapper[4856]: E0320 13:43:42.880875 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/openstack-rabbitmq:r42p\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.972741 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"] Mar 20 13:43:42 crc kubenswrapper[4856]: I0320 13:43:42.979154 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-v5h7t"] Mar 20 13:43:43 crc kubenswrapper[4856]: W0320 13:43:43.018237 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod628ad6bb_ab51_4021_9757_4247a1ccfa71.slice/crio-f7fd88f425e651fc749f100ff75436f9bfbac32dd824126f6ad05384008785de WatchSource:0}: Error finding container f7fd88f425e651fc749f100ff75436f9bfbac32dd824126f6ad05384008785de: Status 404 returned error can't find the container with id f7fd88f425e651fc749f100ff75436f9bfbac32dd824126f6ad05384008785de Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.030244 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.037418 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:43:43 crc 
kubenswrapper[4856]: I0320 13:43:43.145886 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.167470 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:43:43 crc kubenswrapper[4856]: W0320 13:43:43.172531 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822c90b2_be4b_4764_95fb_b0fb02a7a90a.slice/crio-d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce WatchSource:0}: Error finding container d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce: Status 404 returned error can't find the container with id d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce Mar 20 13:43:43 crc kubenswrapper[4856]: W0320 13:43:43.175656 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e2c318a_4df7_4434_8f38_406da145ff89.slice/crio-8697f45617c9efd5e62dc6320f06cd2627911323c81a6628222d5424e26800c6 WatchSource:0}: Error finding container 8697f45617c9efd5e62dc6320f06cd2627911323c81a6628222d5424e26800c6: Status 404 returned error can't find the container with id 8697f45617c9efd5e62dc6320f06cd2627911323c81a6628222d5424e26800c6 Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.244295 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:43:43 crc kubenswrapper[4856]: W0320 13:43:43.354825 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e5225f1_7607_4e11_904f_0e40e483d384.slice/crio-f6bfdc601a8cd61414d901eb0119efb90ad558eb4c8feea245fd73aee433f4a1 WatchSource:0}: Error finding container f6bfdc601a8cd61414d901eb0119efb90ad558eb4c8feea245fd73aee433f4a1: Status 404 returned error can't find the container with id 
f6bfdc601a8cd61414d901eb0119efb90ad558eb4c8feea245fd73aee433f4a1 Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.358461 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.828664 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1213e4bb-eae3-469e-9f60-6ecb48671838" path="/var/lib/kubelet/pods/1213e4bb-eae3-469e-9f60-6ecb48671838/volumes" Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.883357 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"628ad6bb-ab51-4021-9757-4247a1ccfa71","Type":"ContainerStarted","Data":"f7fd88f425e651fc749f100ff75436f9bfbac32dd824126f6ad05384008785de"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.884988 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerStarted","Data":"9d385442c55dd9888a0b9258eb36d00bf9d9ad75f41ebd2cb3de9c08afb3b344"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.886031 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k" event={"ID":"5e2c318a-4df7-4434-8f38-406da145ff89","Type":"ContainerStarted","Data":"8697f45617c9efd5e62dc6320f06cd2627911323c81a6628222d5424e26800c6"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.887709 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerStarted","Data":"f6bfdc601a8cd61414d901eb0119efb90ad558eb4c8feea245fd73aee433f4a1"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.888940 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"e0d71c6e-58a3-48cb-8a06-564dafb339d4","Type":"ContainerStarted","Data":"f778ece17ed602f614d0484d671f0c237a94e47d5c3633a652924b63b46653d6"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.890711 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerStarted","Data":"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0"} Mar 20 13:43:43 crc kubenswrapper[4856]: I0320 13:43:43.891724 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerStarted","Data":"d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce"} Mar 20 13:43:45 crc kubenswrapper[4856]: I0320 13:43:45.921510 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"628ad6bb-ab51-4021-9757-4247a1ccfa71","Type":"ContainerStarted","Data":"064628587a2ade55cadbdf45c06311c4f3381732bc6b09b5dbd0ce453ed18ef9"} Mar 20 13:43:45 crc kubenswrapper[4856]: I0320 13:43:45.922005 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:43:45 crc kubenswrapper[4856]: I0320 13:43:45.961831 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=35.731414763 podStartE2EDuration="37.961796514s" podCreationTimestamp="2026-03-20 13:43:08 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.02208609 +0000 UTC m=+1237.903112240" lastFinishedPulling="2026-03-20 13:43:45.252467861 +0000 UTC m=+1240.133493991" observedRunningTime="2026-03-20 13:43:45.954468744 +0000 UTC m=+1240.835494894" watchObservedRunningTime="2026-03-20 13:43:45.961796514 +0000 UTC m=+1240.842822644" Mar 20 13:43:47 crc kubenswrapper[4856]: I0320 13:43:47.941191 4856 generic.go:334] "Generic (PLEG): container finished" podID="64ade26b-4889-4021-b876-1fdbdb077c26" 
containerID="067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0" exitCode=0 Mar 20 13:43:47 crc kubenswrapper[4856]: I0320 13:43:47.941233 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerDied","Data":"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0"} Mar 20 13:43:48 crc kubenswrapper[4856]: I0320 13:43:48.973732 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerStarted","Data":"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c"} Mar 20 13:43:48 crc kubenswrapper[4856]: I0320 13:43:48.996957 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=41.400019017 podStartE2EDuration="41.996942372s" podCreationTimestamp="2026-03-20 13:43:07 +0000 UTC" firstStartedPulling="2026-03-20 13:43:42.585812914 +0000 UTC m=+1237.466839064" lastFinishedPulling="2026-03-20 13:43:43.182736289 +0000 UTC m=+1238.063762419" observedRunningTime="2026-03-20 13:43:48.995388128 +0000 UTC m=+1243.876414278" watchObservedRunningTime="2026-03-20 13:43:48.996942372 +0000 UTC m=+1243.877968502" Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.008530 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.008586 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.984067 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k" event={"ID":"5e2c318a-4df7-4434-8f38-406da145ff89","Type":"ContainerStarted","Data":"a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af"} Mar 20 
13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.984583 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-c697k" Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.987562 4856 generic.go:334] "Generic (PLEG): container finished" podID="8e5225f1-7607-4e11-904f-0e40e483d384" containerID="7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c" exitCode=0 Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.987593 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerDied","Data":"7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c"} Mar 20 13:43:49 crc kubenswrapper[4856]: I0320 13:43:49.993046 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerStarted","Data":"e74e7181e50b8482827268fba5857dee7206b3d88d8299a92ee57e8febb4c398"} Mar 20 13:43:50 crc kubenswrapper[4856]: I0320 13:43:49.999420 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerStarted","Data":"e82da2e027c260fd3e464cbc4c2865412cd62f8b83fa8ce9156736fd50719665"} Mar 20 13:43:50 crc kubenswrapper[4856]: I0320 13:43:50.009004 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-c697k" podStartSLOduration=29.813758378 podStartE2EDuration="35.008981792s" podCreationTimestamp="2026-03-20 13:43:15 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.179532401 +0000 UTC m=+1238.060558531" lastFinishedPulling="2026-03-20 13:43:48.374755795 +0000 UTC m=+1243.255781945" observedRunningTime="2026-03-20 13:43:50.008715465 +0000 UTC m=+1244.889741595" watchObservedRunningTime="2026-03-20 13:43:50.008981792 +0000 UTC m=+1244.890007942" Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 
13:43:51.007393 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0d71c6e-58a3-48cb-8a06-564dafb339d4","Type":"ContainerStarted","Data":"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55"} Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.007989 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.010229 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerStarted","Data":"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb"} Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.010304 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.010315 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerStarted","Data":"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a"} Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.010325 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.026182 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=34.31492987 podStartE2EDuration="41.026167665s" podCreationTimestamp="2026-03-20 13:43:10 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.049227173 +0000 UTC m=+1237.930253303" lastFinishedPulling="2026-03-20 13:43:49.760464948 +0000 UTC m=+1244.641491098" observedRunningTime="2026-03-20 13:43:51.022185426 +0000 UTC m=+1245.903211556" watchObservedRunningTime="2026-03-20 13:43:51.026167665 +0000 UTC 
m=+1245.907193795" Mar 20 13:43:51 crc kubenswrapper[4856]: I0320 13:43:51.052611 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qxlnx" podStartSLOduration=31.128596312 podStartE2EDuration="36.052593989s" podCreationTimestamp="2026-03-20 13:43:15 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.356008004 +0000 UTC m=+1238.237034134" lastFinishedPulling="2026-03-20 13:43:48.280005671 +0000 UTC m=+1243.161031811" observedRunningTime="2026-03-20 13:43:51.045289829 +0000 UTC m=+1245.926315979" watchObservedRunningTime="2026-03-20 13:43:51.052593989 +0000 UTC m=+1245.933620119" Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.045399 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerStarted","Data":"97772f6ee946ca97d67b8f71fc7f1f0295d3e5e51873e5b35681e4f70ee23a04"} Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.050491 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerStarted","Data":"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"} Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.051633 4856 generic.go:334] "Generic (PLEG): container finished" podID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerID="7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d" exitCode=0 Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.051781 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" event={"ID":"ab5db032-d1c7-4491-a098-d526b85fe6ee","Type":"ContainerDied","Data":"7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d"} Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.058081 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 13:43:54 
crc kubenswrapper[4856]: I0320 13:43:54.058475 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerStarted","Data":"11061327d008cbbe148b32ba2127eb9d46fde32a3092f69a96fe49fa415abcd1"} Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.090169 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=26.976293048 podStartE2EDuration="37.090151072s" podCreationTimestamp="2026-03-20 13:43:17 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.259684386 +0000 UTC m=+1238.140710516" lastFinishedPulling="2026-03-20 13:43:53.37354241 +0000 UTC m=+1248.254568540" observedRunningTime="2026-03-20 13:43:54.084741524 +0000 UTC m=+1248.965767684" watchObservedRunningTime="2026-03-20 13:43:54.090151072 +0000 UTC m=+1248.971177212" Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.160513 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=30.970265682 podStartE2EDuration="41.160491258s" podCreationTimestamp="2026-03-20 13:43:13 +0000 UTC" firstStartedPulling="2026-03-20 13:43:43.179577682 +0000 UTC m=+1238.060603812" lastFinishedPulling="2026-03-20 13:43:53.369803258 +0000 UTC m=+1248.250829388" observedRunningTime="2026-03-20 13:43:54.13208119 +0000 UTC m=+1249.013107380" watchObservedRunningTime="2026-03-20 13:43:54.160491258 +0000 UTC m=+1249.041517388" Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.478326 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.801230 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:54 crc kubenswrapper[4856]: I0320 13:43:54.859602 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.068799 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" event={"ID":"ab5db032-d1c7-4491-a098-d526b85fe6ee","Type":"ContainerStarted","Data":"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba"} Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.069194 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.069509 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.095933 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" podStartSLOduration=2.939993115 podStartE2EDuration="51.095915082s" podCreationTimestamp="2026-03-20 13:43:04 +0000 UTC" firstStartedPulling="2026-03-20 13:43:05.217050768 +0000 UTC m=+1200.098076898" lastFinishedPulling="2026-03-20 13:43:53.372972725 +0000 UTC m=+1248.253998865" observedRunningTime="2026-03-20 13:43:55.089140596 +0000 UTC m=+1249.970166736" watchObservedRunningTime="2026-03-20 13:43:55.095915082 +0000 UTC m=+1249.976941222" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.121279 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.216472 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.303386 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.395938 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.431918 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.433214 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.440558 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.442738 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.489436 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.490560 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.492645 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.512532 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531217 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531284 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531387 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531444 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5jlx\" (UniqueName: \"kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531543 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531588 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531737 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531786 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531818 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfrf2\" (UniqueName: \"kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.531886 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.633855 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.633907 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.633947 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfrf2\" (UniqueName: \"kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.633984 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634034 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634059 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634079 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634103 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5jlx\" (UniqueName: \"kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634181 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634227 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.634705 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.635418 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config\") pod 
\"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.635483 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.635729 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.636062 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.636817 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.637541 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " 
pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.638655 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.652055 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfrf2\" (UniqueName: \"kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2\") pod \"ovn-controller-metrics-m72lx\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.655769 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5jlx\" (UniqueName: \"kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx\") pod \"dnsmasq-dns-7f896c8c65-84b7b\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") " pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.734474 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.762318 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.762937 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.764707 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.769434 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.783973 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"] Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.821620 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.836570 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.836628 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.836668 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566qd\" (UniqueName: \"kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.836691 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.836751 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.938167 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.938227 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566qd\" (UniqueName: \"kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.938251 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.938331 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.938367 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.939240 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.939510 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.939732 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.939898 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: 
\"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:55 crc kubenswrapper[4856]: I0320 13:43:55.955315 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566qd\" (UniqueName: \"kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd\") pod \"dnsmasq-dns-86db49b7ff-fnsbs\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") " pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.232408 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.253855 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.345747 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk5zr\" (UniqueName: \"kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr\") pod \"21b1b99a-1f3f-4824-a3d8-67f180094b11\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.345836 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc\") pod \"21b1b99a-1f3f-4824-a3d8-67f180094b11\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.345873 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config\") pod \"21b1b99a-1f3f-4824-a3d8-67f180094b11\" (UID: \"21b1b99a-1f3f-4824-a3d8-67f180094b11\") " Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.346503 4856 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "21b1b99a-1f3f-4824-a3d8-67f180094b11" (UID: "21b1b99a-1f3f-4824-a3d8-67f180094b11"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.346623 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config" (OuterVolumeSpecName: "config") pod "21b1b99a-1f3f-4824-a3d8-67f180094b11" (UID: "21b1b99a-1f3f-4824-a3d8-67f180094b11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.365372 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"] Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.383885 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.440805 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr" (OuterVolumeSpecName: "kube-api-access-jk5zr") pod "21b1b99a-1f3f-4824-a3d8-67f180094b11" (UID: "21b1b99a-1f3f-4824-a3d8-67f180094b11"). InnerVolumeSpecName "kube-api-access-jk5zr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:56 crc kubenswrapper[4856]: W0320 13:43:56.445747 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c87ef15_13a3_4043_8dc0_55e6636deeef.slice/crio-d680d9b3bba2028522025eff271ac589826875dc70215356fdfbc3e371f51c5e WatchSource:0}: Error finding container d680d9b3bba2028522025eff271ac589826875dc70215356fdfbc3e371f51c5e: Status 404 returned error can't find the container with id d680d9b3bba2028522025eff271ac589826875dc70215356fdfbc3e371f51c5e Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.447288 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.447312 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21b1b99a-1f3f-4824-a3d8-67f180094b11-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.447325 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk5zr\" (UniqueName: \"kubernetes.io/projected/21b1b99a-1f3f-4824-a3d8-67f180094b11-kube-api-access-jk5zr\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:56 crc kubenswrapper[4856]: W0320 13:43:56.447650 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb213c60_487b_4248_bf86_ed69e2fac5e1.slice/crio-401ad87433ebce0dd685dab4fde17359315208139163df09b85629d7729cf06c WatchSource:0}: Error finding container 401ad87433ebce0dd685dab4fde17359315208139163df09b85629d7729cf06c: Status 404 returned error can't find the container with id 401ad87433ebce0dd685dab4fde17359315208139163df09b85629d7729cf06c Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.478542 4856 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.513553 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:56 crc kubenswrapper[4856]: I0320 13:43:56.690584 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.114530 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" event={"ID":"2dad0b1a-c1e1-47b1-b701-85a690f10481","Type":"ContainerStarted","Data":"f5cad51a613774c6dde89d5faea37ea7e3c1feff7d56b50e90a84a4ab7594424"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.121485 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerStarted","Data":"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.125574 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m72lx" event={"ID":"fb213c60-487b-4248-bf86-ed69e2fac5e1","Type":"ContainerStarted","Data":"13fbe8a6214cf53a695ee8776babc39be8ea87b89ebbab5b990a58da11d28f97"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.125613 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m72lx" event={"ID":"fb213c60-487b-4248-bf86-ed69e2fac5e1","Type":"ContainerStarted","Data":"401ad87433ebce0dd685dab4fde17359315208139163df09b85629d7729cf06c"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.127490 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerStarted","Data":"5ab9a98d399750f4cde1f783ae91e1d160bff7a9ccceea461dbfd346e6e256f6"} 
Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.133362 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerStarted","Data":"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.133396 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerStarted","Data":"d680d9b3bba2028522025eff271ac589826875dc70215356fdfbc3e371f51c5e"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.134958 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" event={"ID":"21b1b99a-1f3f-4824-a3d8-67f180094b11","Type":"ContainerDied","Data":"250c8af51d15315e63c967b21ee6a8e0d7a22ac4bd9437b9296276167780b9a8"} Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.135012 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-vzdd5" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.136329 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="dnsmasq-dns" containerID="cri-o://1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba" gracePeriod=10 Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.181326 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.250403 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-m72lx" podStartSLOduration=2.250380874 podStartE2EDuration="2.250380874s" podCreationTimestamp="2026-03-20 13:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:57.212142238 +0000 UTC m=+1252.093168378" watchObservedRunningTime="2026-03-20 13:43:57.250380874 +0000 UTC m=+1252.131407004" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.423209 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gbjqm"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.424506 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.426611 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.433620 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gbjqm"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.471633 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.481397 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-vzdd5"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.510940 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.511253 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsc6v\" (UniqueName: \"kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.566910 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.583493 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.595048 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.595427 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.595590 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.595695 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6n4hr" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.601738 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.619754 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhr6h\" (UniqueName: \"kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.619812 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.619861 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 
13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.619945 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.620011 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.620041 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.620068 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.620102 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsc6v\" (UniqueName: \"kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 
13:43:57.620134 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.631645 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.639799 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsc6v\" (UniqueName: \"kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v\") pod \"root-account-create-update-gbjqm\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.716184 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.720997 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config\") pod \"ab5db032-d1c7-4491-a098-d526b85fe6ee\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721031 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc\") pod \"ab5db032-d1c7-4491-a098-d526b85fe6ee\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721053 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vrk8\" (UniqueName: \"kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8\") pod \"ab5db032-d1c7-4491-a098-d526b85fe6ee\" (UID: \"ab5db032-d1c7-4491-a098-d526b85fe6ee\") " Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721180 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721217 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721261 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721309 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhr6h\" (UniqueName: \"kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721327 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721357 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721404 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.721598 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc 
kubenswrapper[4856]: I0320 13:43:57.722784 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.723006 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.725977 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.727477 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8" (OuterVolumeSpecName: "kube-api-access-4vrk8") pod "ab5db032-d1c7-4491-a098-d526b85fe6ee" (UID: "ab5db032-d1c7-4491-a098-d526b85fe6ee"). InnerVolumeSpecName "kube-api-access-4vrk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.737885 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.738655 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.744157 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhr6h\" (UniqueName: \"kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h\") pod \"ovn-northd-0\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " pod="openstack/ovn-northd-0" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.747726 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gbjqm" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.769435 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab5db032-d1c7-4491-a098-d526b85fe6ee" (UID: "ab5db032-d1c7-4491-a098-d526b85fe6ee"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.779483 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config" (OuterVolumeSpecName: "config") pod "ab5db032-d1c7-4491-a098-d526b85fe6ee" (UID: "ab5db032-d1c7-4491-a098-d526b85fe6ee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.823317 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.823580 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab5db032-d1c7-4491-a098-d526b85fe6ee-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.823595 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vrk8\" (UniqueName: \"kubernetes.io/projected/ab5db032-d1c7-4491-a098-d526b85fe6ee-kube-api-access-4vrk8\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.842413 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b1b99a-1f3f-4824-a3d8-67f180094b11" path="/var/lib/kubelet/pods/21b1b99a-1f3f-4824-a3d8-67f180094b11/volumes" Mar 20 13:43:57 crc kubenswrapper[4856]: I0320 13:43:57.930746 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.152800 4856 generic.go:334] "Generic (PLEG): container finished" podID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerID="82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2" exitCode=0 Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.152912 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerDied","Data":"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"} Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.154973 4856 generic.go:334] "Generic (PLEG): container finished" podID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerID="111338fccc15513427c7a63513a3b2bd06b1200efc391692ba800b94dd3cac6c" exitCode=0 Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.155101 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" event={"ID":"2dad0b1a-c1e1-47b1-b701-85a690f10481","Type":"ContainerDied","Data":"111338fccc15513427c7a63513a3b2bd06b1200efc391692ba800b94dd3cac6c"} Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.155982 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gbjqm"] Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.157079 4856 generic.go:334] "Generic (PLEG): container finished" podID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerID="1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba" exitCode=0 Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.157123 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" event={"ID":"ab5db032-d1c7-4491-a098-d526b85fe6ee","Type":"ContainerDied","Data":"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba"} Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.157157 4856 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.157186 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-ztjtc" event={"ID":"ab5db032-d1c7-4491-a098-d526b85fe6ee","Type":"ContainerDied","Data":"f2a0e796e44850abb37da2fb084e0d692ef859c1080f5329bd3af8d15ab97a38"} Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.157202 4856 scope.go:117] "RemoveContainer" containerID="1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.184835 4856 scope.go:117] "RemoveContainer" containerID="7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.197822 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"] Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.217257 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-ztjtc"] Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.241314 4856 scope.go:117] "RemoveContainer" containerID="1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba" Mar 20 13:43:58 crc kubenswrapper[4856]: E0320 13:43:58.243726 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba\": container with ID starting with 1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba not found: ID does not exist" containerID="1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.243762 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba"} 
err="failed to get container status \"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba\": rpc error: code = NotFound desc = could not find container \"1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba\": container with ID starting with 1a817c52dd44041e6eb7f293e87a9f3c92599df9ae9cf51987d1cdb84856b7ba not found: ID does not exist" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.243787 4856 scope.go:117] "RemoveContainer" containerID="7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d" Mar 20 13:43:58 crc kubenswrapper[4856]: E0320 13:43:58.246532 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d\": container with ID starting with 7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d not found: ID does not exist" containerID="7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.246579 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d"} err="failed to get container status \"7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d\": rpc error: code = NotFound desc = could not find container \"7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d\": container with ID starting with 7a54b4baf544b326064148ceccb72fd134c65e5cd1dc6608579d6b866e3a7e5d not found: ID does not exist" Mar 20 13:43:58 crc kubenswrapper[4856]: I0320 13:43:58.343067 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:43:58 crc kubenswrapper[4856]: W0320 13:43:58.356954 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23fc74c5_121e_4ac1_8d50_8be3393d080a.slice/crio-5e13e39e09955b323bc7368d5da64b824f5d4c3bafd101b5ad4da1eb8c2c51c1 WatchSource:0}: Error finding container 5e13e39e09955b323bc7368d5da64b824f5d4c3bafd101b5ad4da1eb8c2c51c1: Status 404 returned error can't find the container with id 5e13e39e09955b323bc7368d5da64b824f5d4c3bafd101b5ad4da1eb8c2c51c1 Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.168738 4856 generic.go:334] "Generic (PLEG): container finished" podID="40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" containerID="fdbae009f5ef9dcbbfa96f14f7c8dd2d9accee678a6a94ed47f4995dd338234c" exitCode=0 Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.168975 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbjqm" event={"ID":"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da","Type":"ContainerDied","Data":"fdbae009f5ef9dcbbfa96f14f7c8dd2d9accee678a6a94ed47f4995dd338234c"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.169810 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbjqm" event={"ID":"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da","Type":"ContainerStarted","Data":"dc26bc13a893f66eecb69e2db87d2e97a6c0145026940403c81a8f93f9d90896"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.172158 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerStarted","Data":"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.172735 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.174691 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" 
event={"ID":"2dad0b1a-c1e1-47b1-b701-85a690f10481","Type":"ContainerStarted","Data":"1da59b2bf0b7fc2f6334ec02d04bce04abeaed57e52e5607324784c41c618ed5"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.174818 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.175883 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerStarted","Data":"5e13e39e09955b323bc7368d5da64b824f5d4c3bafd101b5ad4da1eb8c2c51c1"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.176961 4856 generic.go:334] "Generic (PLEG): container finished" podID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerID="d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d" exitCode=0 Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.177502 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerDied","Data":"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"} Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.203117 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" podStartSLOduration=4.203097783 podStartE2EDuration="4.203097783s" podCreationTimestamp="2026-03-20 13:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:59.198880408 +0000 UTC m=+1254.079906618" watchObservedRunningTime="2026-03-20 13:43:59.203097783 +0000 UTC m=+1254.084123913" Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.240756 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" podStartSLOduration=4.240733774 
podStartE2EDuration="4.240733774s" podCreationTimestamp="2026-03-20 13:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:59.23295047 +0000 UTC m=+1254.113976620" watchObservedRunningTime="2026-03-20 13:43:59.240733774 +0000 UTC m=+1254.121759904" Mar 20 13:43:59 crc kubenswrapper[4856]: I0320 13:43:59.828872 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" path="/var/lib/kubelet/pods/ab5db032-d1c7-4491-a098-d526b85fe6ee/volumes" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.137341 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-nhn9w"] Mar 20 13:44:00 crc kubenswrapper[4856]: E0320 13:44:00.137724 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="init" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.137745 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="init" Mar 20 13:44:00 crc kubenswrapper[4856]: E0320 13:44:00.137775 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="dnsmasq-dns" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.137783 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="dnsmasq-dns" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.137984 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab5db032-d1c7-4491-a098-d526b85fe6ee" containerName="dnsmasq-dns" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.138662 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.144654 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.144767 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.144910 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.145779 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-nhn9w"] Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.184437 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerStarted","Data":"00f17ef2ce0ea5d744a6efd0758584d67cfece02bc7267d434c74cf910e4020f"} Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.184492 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerStarted","Data":"9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69"} Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.184575 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.187200 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerStarted","Data":"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"} Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.204964 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-northd-0" podStartSLOduration=2.195784272 podStartE2EDuration="3.204944445s" podCreationTimestamp="2026-03-20 13:43:57 +0000 UTC" firstStartedPulling="2026-03-20 13:43:58.361954381 +0000 UTC m=+1253.242980511" lastFinishedPulling="2026-03-20 13:43:59.371114564 +0000 UTC m=+1254.252140684" observedRunningTime="2026-03-20 13:44:00.204558385 +0000 UTC m=+1255.085584525" watchObservedRunningTime="2026-03-20 13:44:00.204944445 +0000 UTC m=+1255.085970575" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.268662 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjdh\" (UniqueName: \"kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh\") pod \"auto-csr-approver-29566904-nhn9w\" (UID: \"3e26cfc8-a8ce-4db8-bc87-f57d1d209731\") " pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.370988 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjdh\" (UniqueName: \"kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh\") pod \"auto-csr-approver-29566904-nhn9w\" (UID: \"3e26cfc8-a8ce-4db8-bc87-f57d1d209731\") " pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.394862 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjdh\" (UniqueName: \"kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh\") pod \"auto-csr-approver-29566904-nhn9w\" (UID: \"3e26cfc8-a8ce-4db8-bc87-f57d1d209731\") " pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.465207 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.474295 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gbjqm" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.493792 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371982.361 podStartE2EDuration="54.493775794s" podCreationTimestamp="2026-03-20 13:43:06 +0000 UTC" firstStartedPulling="2026-03-20 13:43:07.983729125 +0000 UTC m=+1202.864755255" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:00.224008817 +0000 UTC m=+1255.105034947" watchObservedRunningTime="2026-03-20 13:44:00.493775794 +0000 UTC m=+1255.374801924" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.574530 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsc6v\" (UniqueName: \"kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v\") pod \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.574640 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts\") pod \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\" (UID: \"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da\") " Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.575757 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" (UID: "40f1e6b2-2fc6-4676-8f63-3c9ee5d097da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.579201 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v" (OuterVolumeSpecName: "kube-api-access-bsc6v") pod "40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" (UID: "40f1e6b2-2fc6-4676-8f63-3c9ee5d097da"). InnerVolumeSpecName "kube-api-access-bsc6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.676940 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsc6v\" (UniqueName: \"kubernetes.io/projected/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-kube-api-access-bsc6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.677186 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:00 crc kubenswrapper[4856]: I0320 13:44:00.915479 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-nhn9w"] Mar 20 13:44:00 crc kubenswrapper[4856]: W0320 13:44:00.923510 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e26cfc8_a8ce_4db8_bc87_f57d1d209731.slice/crio-446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a WatchSource:0}: Error finding container 446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a: Status 404 returned error can't find the container with id 446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.194554 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" 
event={"ID":"3e26cfc8-a8ce-4db8-bc87-f57d1d209731","Type":"ContainerStarted","Data":"446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a"} Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.194835 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"] Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.201707 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="dnsmasq-dns" containerID="cri-o://42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba" gracePeriod=10 Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.202155 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gbjqm" Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.210753 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gbjqm" event={"ID":"40f1e6b2-2fc6-4676-8f63-3c9ee5d097da","Type":"ContainerDied","Data":"dc26bc13a893f66eecb69e2db87d2e97a6c0145026940403c81a8f93f9d90896"} Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.211007 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc26bc13a893f66eecb69e2db87d2e97a6c0145026940403c81a8f93f9d90896" Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.217015 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.247080 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"] Mar 20 13:44:01 crc kubenswrapper[4856]: E0320 13:44:01.247488 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" containerName="mariadb-account-create-update" Mar 20 13:44:01 crc kubenswrapper[4856]: 
I0320 13:44:01.247505 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" containerName="mariadb-account-create-update"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.247687 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" containerName="mariadb-account-create-update"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.248530 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.262979 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"]
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.391201 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.391332 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chgq\" (UniqueName: \"kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.391364 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.391397 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.391587 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.496179 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.496260 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9chgq\" (UniqueName: \"kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.496310 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.496341 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.496387 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.497471 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.498395 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.498401 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.499023 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.518606 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chgq\" (UniqueName: \"kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq\") pod \"dnsmasq-dns-698758b865-nvzdt\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") " pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.562625 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.637940 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b"
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.800147 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5jlx\" (UniqueName: \"kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx\") pod \"3c87ef15-13a3-4043-8dc0-55e6636deeef\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") "
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.800349 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config\") pod \"3c87ef15-13a3-4043-8dc0-55e6636deeef\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") "
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.800386 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc\") pod \"3c87ef15-13a3-4043-8dc0-55e6636deeef\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") "
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.800433 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb\") pod \"3c87ef15-13a3-4043-8dc0-55e6636deeef\" (UID: \"3c87ef15-13a3-4043-8dc0-55e6636deeef\") "
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.827434 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx" (OuterVolumeSpecName: "kube-api-access-n5jlx") pod "3c87ef15-13a3-4043-8dc0-55e6636deeef" (UID: "3c87ef15-13a3-4043-8dc0-55e6636deeef"). InnerVolumeSpecName "kube-api-access-n5jlx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.850046 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config" (OuterVolumeSpecName: "config") pod "3c87ef15-13a3-4043-8dc0-55e6636deeef" (UID: "3c87ef15-13a3-4043-8dc0-55e6636deeef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.859964 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c87ef15-13a3-4043-8dc0-55e6636deeef" (UID: "3c87ef15-13a3-4043-8dc0-55e6636deeef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.863834 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c87ef15-13a3-4043-8dc0-55e6636deeef" (UID: "3c87ef15-13a3-4043-8dc0-55e6636deeef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.902071 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.902109 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5jlx\" (UniqueName: \"kubernetes.io/projected/3c87ef15-13a3-4043-8dc0-55e6636deeef-kube-api-access-n5jlx\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.902121 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:01 crc kubenswrapper[4856]: I0320 13:44:01.902131 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c87ef15-13a3-4043-8dc0-55e6636deeef-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.056924 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"]
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.211670 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-nvzdt" event={"ID":"b55fcdcb-f03a-448e-9fc1-a8d04504b935","Type":"ContainerStarted","Data":"de5a575a0778df092f923395ccba36e60bcfd6422a8f6d823a033760878d1a56"}
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.213687 4856 generic.go:334] "Generic (PLEG): container finished" podID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerID="42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba" exitCode=0
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.213725 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerDied","Data":"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"}
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.213747 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b" event={"ID":"3c87ef15-13a3-4043-8dc0-55e6636deeef","Type":"ContainerDied","Data":"d680d9b3bba2028522025eff271ac589826875dc70215356fdfbc3e371f51c5e"}
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.213765 4856 scope.go:117] "RemoveContainer" containerID="42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.213901 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-84b7b"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.257046 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"]
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.259549 4856 scope.go:117] "RemoveContainer" containerID="82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.260819 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-84b7b"]
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.287440 4856 scope.go:117] "RemoveContainer" containerID="42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.287841 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba\": container with ID starting with 42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba not found: ID does not exist" containerID="42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.287886 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba"} err="failed to get container status \"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba\": rpc error: code = NotFound desc = could not find container \"42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba\": container with ID starting with 42894e68f8f9997314223683ab21c5a17c1c6a35263e144617def73a26105dba not found: ID does not exist"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.287912 4856 scope.go:117] "RemoveContainer" containerID="82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.288194 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2\": container with ID starting with 82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2 not found: ID does not exist" containerID="82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.288229 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2"} err="failed to get container status \"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2\": rpc error: code = NotFound desc = could not find container \"82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2\": container with ID starting with 82d5cc6bee50f6626c1fe9aac0cf2c36bb5f1aa382d4b3a6b5f455b522f19fe2 not found: ID does not exist"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.308097 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.308425 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="dnsmasq-dns"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.308439 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="dnsmasq-dns"
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.308456 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="init"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.308461 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="init"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.308622 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" containerName="dnsmasq-dns"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.313404 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.321407 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.321637 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.322261 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.322664 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-l9pns"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.334071 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412155 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412206 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412255 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412377 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mll\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412404 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.412440 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514419 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514503 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mll\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514531 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514569 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514658 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.514679 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.515032 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.515058 4856 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.515098 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: E0320 13:44:02.515125 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:03.015103291 +0000 UTC m=+1257.896129421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.515232 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.515301 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.519330 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.532745 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mll\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.535199 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.966799 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-6hmzq"]
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.969360 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.972040 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.972312 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.972813 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 20 13:44:02 crc kubenswrapper[4856]: I0320 13:44:02.975725 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6hmzq"]
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.022585 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:03 crc kubenswrapper[4856]: E0320 13:44:03.022776 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:44:03 crc kubenswrapper[4856]: E0320 13:44:03.022809 4856 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:44:03 crc kubenswrapper[4856]: E0320 13:44:03.022878 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:04.022857315 +0000 UTC m=+1258.903883455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124005 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124058 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7l97\" (UniqueName: \"kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124096 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124168 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124197 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124261 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.124330 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225427 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225584 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225635 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225725 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225791 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225868 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.225902 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7l97\" (UniqueName: \"kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.226569 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.226638 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e26cfc8-a8ce-4db8-bc87-f57d1d209731" containerID="00a8dfecbc0a5d7ce0a19c67a349d5cccda83e9e9fee4ec3f2df1ac74770930a" exitCode=0
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.226733 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" event={"ID":"3e26cfc8-a8ce-4db8-bc87-f57d1d209731","Type":"ContainerDied","Data":"00a8dfecbc0a5d7ce0a19c67a349d5cccda83e9e9fee4ec3f2df1ac74770930a"}
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.227878 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.228180 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.231618 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.233853 4856 generic.go:334] "Generic (PLEG): container finished" podID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerID="371ed1a7c848633219870522eddd5617212429dd771030b4f04bbf70e1fd6de0" exitCode=0
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.233906 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-nvzdt" event={"ID":"b55fcdcb-f03a-448e-9fc1-a8d04504b935","Type":"ContainerDied","Data":"371ed1a7c848633219870522eddd5617212429dd771030b4f04bbf70e1fd6de0"}
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.239459 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.249567 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.252071 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7l97\" (UniqueName: \"kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97\") pod \"swift-ring-rebalance-6hmzq\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.290563 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.780062 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-6hmzq"]
Mar 20 13:44:03 crc kubenswrapper[4856]: I0320 13:44:03.834410 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c87ef15-13a3-4043-8dc0-55e6636deeef" path="/var/lib/kubelet/pods/3c87ef15-13a3-4043-8dc0-55e6636deeef/volumes"
Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.049736 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:04 crc kubenswrapper[4856]: E0320 13:44:04.049989 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 20 13:44:04 crc kubenswrapper[4856]: E0320 13:44:04.050601 4856 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 20 13:44:04 crc kubenswrapper[4856]: E0320 13:44:04.050660 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:06.050634797 +0000 UTC m=+1260.931660927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found
Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.243653 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hmzq" event={"ID":"a0094c63-b84f-4a1c-839b-47da04da9efb","Type":"ContainerStarted","Data":"90fd1ff6b40a1a17d3768a310806201b51880c924814918d5183b6a7e8964881"}
Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.245945 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-nvzdt" event={"ID":"b55fcdcb-f03a-448e-9fc1-a8d04504b935","Type":"ContainerStarted","Data":"8b9eb612881133f8106b807c3fc767f1a342062e5330fcf41145845996fb5dd0"}
Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.267629 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-nvzdt" podStartSLOduration=3.267610778 podStartE2EDuration="3.267610778s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:04.264439772 +0000 UTC m=+1259.145465922" watchObservedRunningTime="2026-03-20 13:44:04.267610778 +0000 UTC m=+1259.148636908"
Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.589027 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.761899 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvjdh\" (UniqueName: \"kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh\") pod \"3e26cfc8-a8ce-4db8-bc87-f57d1d209731\" (UID: \"3e26cfc8-a8ce-4db8-bc87-f57d1d209731\") " Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.769605 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh" (OuterVolumeSpecName: "kube-api-access-jvjdh") pod "3e26cfc8-a8ce-4db8-bc87-f57d1d209731" (UID: "3e26cfc8-a8ce-4db8-bc87-f57d1d209731"). InnerVolumeSpecName "kube-api-access-jvjdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:04 crc kubenswrapper[4856]: I0320 13:44:04.865221 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvjdh\" (UniqueName: \"kubernetes.io/projected/3e26cfc8-a8ce-4db8-bc87-f57d1d209731-kube-api-access-jvjdh\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.257632 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.257626 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-nhn9w" event={"ID":"3e26cfc8-a8ce-4db8-bc87-f57d1d209731","Type":"ContainerDied","Data":"446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a"} Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.257721 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="446237201b183a399705762e04b8adf244e0b7341952d76a523b619b9432b50a" Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.258188 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-nvzdt" Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.656345 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-226wk"] Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.661669 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-226wk"] Mar 20 13:44:05 crc kubenswrapper[4856]: I0320 13:44:05.831123 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8408c48-989f-4f69-b388-c8d1d0e2e8ea" path="/var/lib/kubelet/pods/b8408c48-989f-4f69-b388-c8d1d0e2e8ea/volumes" Mar 20 13:44:06 crc kubenswrapper[4856]: I0320 13:44:06.087384 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0" Mar 20 13:44:06 crc kubenswrapper[4856]: E0320 13:44:06.087540 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:44:06 crc kubenswrapper[4856]: E0320 13:44:06.087557 4856 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:44:06 crc kubenswrapper[4856]: E0320 13:44:06.087603 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:10.087589812 +0000 UTC m=+1264.968615942 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found Mar 20 13:44:06 crc kubenswrapper[4856]: I0320 13:44:06.234659 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" Mar 20 13:44:07 crc kubenswrapper[4856]: I0320 13:44:07.418437 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:44:07 crc kubenswrapper[4856]: I0320 13:44:07.418742 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:44:07 crc kubenswrapper[4856]: I0320 13:44:07.524742 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 13:44:08 crc kubenswrapper[4856]: I0320 13:44:08.310306 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hmzq" event={"ID":"a0094c63-b84f-4a1c-839b-47da04da9efb","Type":"ContainerStarted","Data":"c1855580ac0c21798b3e4dc54c1cbf54ac5da36e0b5f20f530bd19ccb88ac3c7"} Mar 20 13:44:08 crc kubenswrapper[4856]: I0320 13:44:08.351858 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-6hmzq" podStartSLOduration=2.32500641 
podStartE2EDuration="6.351829751s" podCreationTimestamp="2026-03-20 13:44:02 +0000 UTC" firstStartedPulling="2026-03-20 13:44:03.796501609 +0000 UTC m=+1258.677527759" lastFinishedPulling="2026-03-20 13:44:07.82332497 +0000 UTC m=+1262.704351100" observedRunningTime="2026-03-20 13:44:08.335732631 +0000 UTC m=+1263.216758821" watchObservedRunningTime="2026-03-20 13:44:08.351829751 +0000 UTC m=+1263.232855921" Mar 20 13:44:08 crc kubenswrapper[4856]: I0320 13:44:08.420197 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.044338 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-rv7l2"] Mar 20 13:44:10 crc kubenswrapper[4856]: E0320 13:44:10.045187 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e26cfc8-a8ce-4db8-bc87-f57d1d209731" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.045207 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e26cfc8-a8ce-4db8-bc87-f57d1d209731" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.045586 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e26cfc8-a8ce-4db8-bc87-f57d1d209731" containerName="oc" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.046421 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.055742 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rv7l2"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.144839 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-4b4e-account-create-update-fk89g"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.146146 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.148525 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.152878 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4b4e-account-create-update-fk89g"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.160387 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.160471 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.160513 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgxmt\" (UniqueName: \"kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: E0320 13:44:10.160682 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:44:10 crc kubenswrapper[4856]: E0320 13:44:10.160701 4856 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:44:10 
crc kubenswrapper[4856]: E0320 13:44:10.160748 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:18.160729162 +0000 UTC m=+1273.041755302 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.258519 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-24c45"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.259501 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.261599 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.261672 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gvvw\" (UniqueName: \"kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.261735 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgxmt\" (UniqueName: 
\"kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.261777 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.262342 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.279675 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-24c45"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.285231 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgxmt\" (UniqueName: \"kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt\") pod \"keystone-db-create-rv7l2\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") " pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.332234 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6441-account-create-update-cktwb"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.334414 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.336944 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.343009 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6441-account-create-update-cktwb"] Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.365569 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.365648 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts\") pod \"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.365752 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gvvw\" (UniqueName: \"kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.365778 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p2gp\" (UniqueName: \"kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp\") pod 
\"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.366776 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.382491 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gvvw\" (UniqueName: \"kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw\") pod \"keystone-4b4e-account-create-update-fk89g\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") " pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.416519 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-rv7l2" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467008 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts\") pod \"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467092 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqsj\" (UniqueName: \"kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467129 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p2gp\" (UniqueName: \"kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp\") pod \"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467162 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467785 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts\") pod 
\"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.467899 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-fk89g" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.512845 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p2gp\" (UniqueName: \"kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp\") pod \"placement-db-create-24c45\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") " pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.573145 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqsj\" (UniqueName: \"kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.573262 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.574314 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc 
kubenswrapper[4856]: I0320 13:44:10.576655 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-24c45" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.589933 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqsj\" (UniqueName: \"kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj\") pod \"placement-6441-account-create-update-cktwb\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") " pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.652808 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.824360 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-rv7l2"] Mar 20 13:44:10 crc kubenswrapper[4856]: W0320 13:44:10.826893 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod623925cd_b615_49a0_be71_08bf412fdc92.slice/crio-c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa WatchSource:0}: Error finding container c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa: Status 404 returned error can't find the container with id c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa Mar 20 13:44:10 crc kubenswrapper[4856]: W0320 13:44:10.920542 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64bb4e5_da79_4c81_b430_1d65747c1d37.slice/crio-532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950 WatchSource:0}: Error finding container 532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950: Status 404 returned error can't find the container with id 
532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950 Mar 20 13:44:10 crc kubenswrapper[4856]: I0320 13:44:10.922524 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4b4e-account-create-update-fk89g"] Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.025915 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-24c45"] Mar 20 13:44:11 crc kubenswrapper[4856]: W0320 13:44:11.040986 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7464638_d349_45b6_86af_1a5cd9f7664f.slice/crio-451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959 WatchSource:0}: Error finding container 451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959: Status 404 returned error can't find the container with id 451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959 Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.138492 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6441-account-create-update-cktwb"] Mar 20 13:44:11 crc kubenswrapper[4856]: W0320 13:44:11.142197 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d40f3b7_d738_43d5_aa70_943d6c2afd59.slice/crio-fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505 WatchSource:0}: Error finding container fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505: Status 404 returned error can't find the container with id fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505 Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.338468 4856 generic.go:334] "Generic (PLEG): container finished" podID="623925cd-b615-49a0-be71-08bf412fdc92" containerID="d9659d1018001e10fbb34682f5c5367a558d71cd9ab058ad5de9f78c74b341b6" exitCode=0 Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.338531 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rv7l2" event={"ID":"623925cd-b615-49a0-be71-08bf412fdc92","Type":"ContainerDied","Data":"d9659d1018001e10fbb34682f5c5367a558d71cd9ab058ad5de9f78c74b341b6"} Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.338559 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rv7l2" event={"ID":"623925cd-b615-49a0-be71-08bf412fdc92","Type":"ContainerStarted","Data":"c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa"} Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.340251 4856 generic.go:334] "Generic (PLEG): container finished" podID="f64bb4e5-da79-4c81-b430-1d65747c1d37" containerID="36fcff4fedd17a8db136e70258f4b74c385ee450d44580a92885487081aa1af7" exitCode=0 Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.340319 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4b4e-account-create-update-fk89g" event={"ID":"f64bb4e5-da79-4c81-b430-1d65747c1d37","Type":"ContainerDied","Data":"36fcff4fedd17a8db136e70258f4b74c385ee450d44580a92885487081aa1af7"} Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.340340 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4b4e-account-create-update-fk89g" event={"ID":"f64bb4e5-da79-4c81-b430-1d65747c1d37","Type":"ContainerStarted","Data":"532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950"} Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.342472 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-24c45" event={"ID":"f7464638-d349-45b6-86af-1a5cd9f7664f","Type":"ContainerStarted","Data":"942d84a14a4be4dfa3b0b57021f39fb11626ccf0b4b7bb7010b8dc07df4835fb"} Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.342500 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-24c45" 
event={"ID":"f7464638-d349-45b6-86af-1a5cd9f7664f","Type":"ContainerStarted","Data":"451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959"}
Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.343727 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-cktwb" event={"ID":"6d40f3b7-d738-43d5-aa70-943d6c2afd59","Type":"ContainerStarted","Data":"fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505"}
Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.364345 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-24c45" podStartSLOduration=1.364326899 podStartE2EDuration="1.364326899s" podCreationTimestamp="2026-03-20 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:11.362712244 +0000 UTC m=+1266.243738394" watchObservedRunningTime="2026-03-20 13:44:11.364326899 +0000 UTC m=+1266.245353029"
Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.565310 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.616809 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"]
Mar 20 13:44:11 crc kubenswrapper[4856]: I0320 13:44:11.617088 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="dnsmasq-dns" containerID="cri-o://1da59b2bf0b7fc2f6334ec02d04bce04abeaed57e52e5607324784c41c618ed5" gracePeriod=10
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.353540 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-cktwb" event={"ID":"6d40f3b7-d738-43d5-aa70-943d6c2afd59","Type":"ContainerStarted","Data":"1301da16b3fd93ee4281a8dd4293e0bcc97da5a9551305c02170570ec4f1d44a"}
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.355356 4856 generic.go:334] "Generic (PLEG): container finished" podID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerID="1da59b2bf0b7fc2f6334ec02d04bce04abeaed57e52e5607324784c41c618ed5" exitCode=0
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.355392 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" event={"ID":"2dad0b1a-c1e1-47b1-b701-85a690f10481","Type":"ContainerDied","Data":"1da59b2bf0b7fc2f6334ec02d04bce04abeaed57e52e5607324784c41c618ed5"}
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.357796 4856 generic.go:334] "Generic (PLEG): container finished" podID="f7464638-d349-45b6-86af-1a5cd9f7664f" containerID="942d84a14a4be4dfa3b0b57021f39fb11626ccf0b4b7bb7010b8dc07df4835fb" exitCode=0
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.357849 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-24c45" event={"ID":"f7464638-d349-45b6-86af-1a5cd9f7664f","Type":"ContainerDied","Data":"942d84a14a4be4dfa3b0b57021f39fb11626ccf0b4b7bb7010b8dc07df4835fb"}
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.375558 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6441-account-create-update-cktwb" podStartSLOduration=2.375542177 podStartE2EDuration="2.375542177s" podCreationTimestamp="2026-03-20 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:12.370019566 +0000 UTC m=+1267.251045716" watchObservedRunningTime="2026-03-20 13:44:12.375542177 +0000 UTC m=+1267.256568307"
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.878044 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rv7l2"
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.889510 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-fk89g"
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.913363 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts\") pod \"623925cd-b615-49a0-be71-08bf412fdc92\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") "
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.913604 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgxmt\" (UniqueName: \"kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt\") pod \"623925cd-b615-49a0-be71-08bf412fdc92\" (UID: \"623925cd-b615-49a0-be71-08bf412fdc92\") "
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.915149 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "623925cd-b615-49a0-be71-08bf412fdc92" (UID: "623925cd-b615-49a0-be71-08bf412fdc92"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:12 crc kubenswrapper[4856]: I0320 13:44:12.926076 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt" (OuterVolumeSpecName: "kube-api-access-pgxmt") pod "623925cd-b615-49a0-be71-08bf412fdc92" (UID: "623925cd-b615-49a0-be71-08bf412fdc92"). InnerVolumeSpecName "kube-api-access-pgxmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.015136 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts\") pod \"f64bb4e5-da79-4c81-b430-1d65747c1d37\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") "
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.015220 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gvvw\" (UniqueName: \"kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw\") pod \"f64bb4e5-da79-4c81-b430-1d65747c1d37\" (UID: \"f64bb4e5-da79-4c81-b430-1d65747c1d37\") "
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.015609 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgxmt\" (UniqueName: \"kubernetes.io/projected/623925cd-b615-49a0-be71-08bf412fdc92-kube-api-access-pgxmt\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.015625 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/623925cd-b615-49a0-be71-08bf412fdc92-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.015662 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f64bb4e5-da79-4c81-b430-1d65747c1d37" (UID: "f64bb4e5-da79-4c81-b430-1d65747c1d37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.018847 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw" (OuterVolumeSpecName: "kube-api-access-5gvvw") pod "f64bb4e5-da79-4c81-b430-1d65747c1d37" (UID: "f64bb4e5-da79-4c81-b430-1d65747c1d37"). InnerVolumeSpecName "kube-api-access-5gvvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.117944 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f64bb4e5-da79-4c81-b430-1d65747c1d37-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.118013 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gvvw\" (UniqueName: \"kubernetes.io/projected/f64bb4e5-da79-4c81-b430-1d65747c1d37-kube-api-access-5gvvw\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.366973 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-rv7l2"
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.367011 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-rv7l2" event={"ID":"623925cd-b615-49a0-be71-08bf412fdc92","Type":"ContainerDied","Data":"c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa"}
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.367054 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8c9333474939ea4de10c110c4187607b0fc9ca36294f5c0063bef822a8d33aa"
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.370106 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-4b4e-account-create-update-fk89g" event={"ID":"f64bb4e5-da79-4c81-b430-1d65747c1d37","Type":"ContainerDied","Data":"532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950"}
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.370165 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="532dfccca08fa8a84509c52611f755a03750e0682460e2efe85931654c4e2950"
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.370328 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-fk89g"
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.629876 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-24c45"
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.729193 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p2gp\" (UniqueName: \"kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp\") pod \"f7464638-d349-45b6-86af-1a5cd9f7664f\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") "
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.729353 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts\") pod \"f7464638-d349-45b6-86af-1a5cd9f7664f\" (UID: \"f7464638-d349-45b6-86af-1a5cd9f7664f\") "
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.729931 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7464638-d349-45b6-86af-1a5cd9f7664f" (UID: "f7464638-d349-45b6-86af-1a5cd9f7664f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.730754 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7464638-d349-45b6-86af-1a5cd9f7664f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.732942 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp" (OuterVolumeSpecName: "kube-api-access-8p2gp") pod "f7464638-d349-45b6-86af-1a5cd9f7664f" (UID: "f7464638-d349-45b6-86af-1a5cd9f7664f"). InnerVolumeSpecName "kube-api-access-8p2gp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:13 crc kubenswrapper[4856]: I0320 13:44:13.831585 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p2gp\" (UniqueName: \"kubernetes.io/projected/f7464638-d349-45b6-86af-1a5cd9f7664f-kube-api-access-8p2gp\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279020 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2mn42"]
Mar 20 13:44:14 crc kubenswrapper[4856]: E0320 13:44:14.279598 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64bb4e5-da79-4c81-b430-1d65747c1d37" containerName="mariadb-account-create-update"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279619 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64bb4e5-da79-4c81-b430-1d65747c1d37" containerName="mariadb-account-create-update"
Mar 20 13:44:14 crc kubenswrapper[4856]: E0320 13:44:14.279641 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="623925cd-b615-49a0-be71-08bf412fdc92" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279650 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="623925cd-b615-49a0-be71-08bf412fdc92" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: E0320 13:44:14.279680 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7464638-d349-45b6-86af-1a5cd9f7664f" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279686 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7464638-d349-45b6-86af-1a5cd9f7664f" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279841 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="623925cd-b615-49a0-be71-08bf412fdc92" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279863 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7464638-d349-45b6-86af-1a5cd9f7664f" containerName="mariadb-database-create"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.279872 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64bb4e5-da79-4c81-b430-1d65747c1d37" containerName="mariadb-account-create-update"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.280392 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.291857 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2mn42"]
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.338366 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.338441 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-298bh\" (UniqueName: \"kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.385595 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-812f-account-create-update-7d2cz"]
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.386801 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.390645 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.394595 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-24c45" event={"ID":"f7464638-d349-45b6-86af-1a5cd9f7664f","Type":"ContainerDied","Data":"451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959"}
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.394639 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="451bae05b0d9f8870f27fc5677a0db73f33f335fdf61e310c00413cb74864959"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.394687 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-24c45"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.412887 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-812f-account-create-update-7d2cz"]
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.439516 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plmmv\" (UniqueName: \"kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.439662 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.439699 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.439736 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-298bh\" (UniqueName: \"kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.440822 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.461085 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-298bh\" (UniqueName: \"kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh\") pod \"glance-db-create-2mn42\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") " pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.541334 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.541410 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plmmv\" (UniqueName: \"kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.542352 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.556410 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plmmv\" (UniqueName: \"kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv\") pod \"glance-812f-account-create-update-7d2cz\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") " pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.602050 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:14 crc kubenswrapper[4856]: I0320 13:44:14.719471 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:15 crc kubenswrapper[4856]: I0320 13:44:15.069077 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2mn42"]
Mar 20 13:44:15 crc kubenswrapper[4856]: I0320 13:44:15.158587 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-812f-account-create-update-7d2cz"]
Mar 20 13:44:15 crc kubenswrapper[4856]: I0320 13:44:15.402506 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-812f-account-create-update-7d2cz" event={"ID":"0081bda3-3f0b-4c1e-a7ab-90af9235521f","Type":"ContainerStarted","Data":"c2b85719c1d0210f39e49b9b48d39173be34a0937cc408897683bd3a0f5668f4"}
Mar 20 13:44:15 crc kubenswrapper[4856]: I0320 13:44:15.403637 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2mn42" event={"ID":"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de","Type":"ContainerStarted","Data":"623e7237b7a9d1e46c3a4d31ccdf28367dc95df67eda792773df4aaa8cce25af"}
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.060653 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gbjqm"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.069845 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gbjqm"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.141918 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-27vk4"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.142893 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.145432 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.152443 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-27vk4"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.170019 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42l4\" (UniqueName: \"kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.170398 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.266564 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.271707 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w42l4\" (UniqueName: \"kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.271832 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.272538 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.290723 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w42l4\" (UniqueName: \"kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4\") pod \"root-account-create-update-27vk4\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.373366 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc\") pod \"2dad0b1a-c1e1-47b1-b701-85a690f10481\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") "
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.373557 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config\") pod \"2dad0b1a-c1e1-47b1-b701-85a690f10481\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") "
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.373671 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb\") pod \"2dad0b1a-c1e1-47b1-b701-85a690f10481\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") "
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.373732 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-566qd\" (UniqueName: \"kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd\") pod \"2dad0b1a-c1e1-47b1-b701-85a690f10481\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") "
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.373782 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb\") pod \"2dad0b1a-c1e1-47b1-b701-85a690f10481\" (UID: \"2dad0b1a-c1e1-47b1-b701-85a690f10481\") "
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.377371 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd" (OuterVolumeSpecName: "kube-api-access-566qd") pod "2dad0b1a-c1e1-47b1-b701-85a690f10481" (UID: "2dad0b1a-c1e1-47b1-b701-85a690f10481"). InnerVolumeSpecName "kube-api-access-566qd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.416428 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2dad0b1a-c1e1-47b1-b701-85a690f10481" (UID: "2dad0b1a-c1e1-47b1-b701-85a690f10481"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.420514 4856 generic.go:334] "Generic (PLEG): container finished" podID="6d40f3b7-d738-43d5-aa70-943d6c2afd59" containerID="1301da16b3fd93ee4281a8dd4293e0bcc97da5a9551305c02170570ec4f1d44a" exitCode=0
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.420586 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-cktwb" event={"ID":"6d40f3b7-d738-43d5-aa70-943d6c2afd59","Type":"ContainerDied","Data":"1301da16b3fd93ee4281a8dd4293e0bcc97da5a9551305c02170570ec4f1d44a"}
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.420760 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2dad0b1a-c1e1-47b1-b701-85a690f10481" (UID: "2dad0b1a-c1e1-47b1-b701-85a690f10481"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.422659 4856 generic.go:334] "Generic (PLEG): container finished" podID="2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" containerID="e0baf76a23d10f6480a418e705166277839f259f203f9d0a350f3e3805d573a6" exitCode=0
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.422726 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2mn42" event={"ID":"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de","Type":"ContainerDied","Data":"e0baf76a23d10f6480a418e705166277839f259f203f9d0a350f3e3805d573a6"}
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.425012 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" event={"ID":"2dad0b1a-c1e1-47b1-b701-85a690f10481","Type":"ContainerDied","Data":"f5cad51a613774c6dde89d5faea37ea7e3c1feff7d56b50e90a84a4ab7594424"}
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.425071 4856 scope.go:117] "RemoveContainer" containerID="1da59b2bf0b7fc2f6334ec02d04bce04abeaed57e52e5607324784c41c618ed5"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.425363 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.429076 4856 generic.go:334] "Generic (PLEG): container finished" podID="0081bda3-3f0b-4c1e-a7ab-90af9235521f" containerID="e44988e1ef016b282859d3b5848743fd242e2ba8424bb50bbbf315c770a62806" exitCode=0
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.429196 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-812f-account-create-update-7d2cz" event={"ID":"0081bda3-3f0b-4c1e-a7ab-90af9235521f","Type":"ContainerDied","Data":"e44988e1ef016b282859d3b5848743fd242e2ba8424bb50bbbf315c770a62806"}
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.453157 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2dad0b1a-c1e1-47b1-b701-85a690f10481" (UID: "2dad0b1a-c1e1-47b1-b701-85a690f10481"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.476593 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-566qd\" (UniqueName: \"kubernetes.io/projected/2dad0b1a-c1e1-47b1-b701-85a690f10481-kube-api-access-566qd\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.476629 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.476642 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.476655 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.487450 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config" (OuterVolumeSpecName: "config") pod "2dad0b1a-c1e1-47b1-b701-85a690f10481" (UID: "2dad0b1a-c1e1-47b1-b701-85a690f10481"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.528998 4856 scope.go:117] "RemoveContainer" containerID="111338fccc15513427c7a63513a3b2bd06b1200efc391692ba800b94dd3cac6c"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.563747 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-27vk4"
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.580735 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dad0b1a-c1e1-47b1-b701-85a690f10481-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.757986 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.771838 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-fnsbs"]
Mar 20 13:44:16 crc kubenswrapper[4856]: I0320 13:44:16.978037 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-27vk4"]
Mar 20 13:44:16 crc kubenswrapper[4856]: W0320 13:44:16.980031 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde5a94fd_f271_4229_be43_98ca4a079573.slice/crio-8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519 WatchSource:0}: Error finding container 8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519: Status 404 returned error can't find the container with id 8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.439368 4856 generic.go:334] "Generic (PLEG): container finished" podID="de5a94fd-f271-4229-be43-98ca4a079573" containerID="37fa27fb17bdccebcd671a7ee5c397835628b1892f2a54a0c7afe739f659be86" exitCode=0
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.439549 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27vk4" event={"ID":"de5a94fd-f271-4229-be43-98ca4a079573","Type":"ContainerDied","Data":"37fa27fb17bdccebcd671a7ee5c397835628b1892f2a54a0c7afe739f659be86"}
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.439755 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27vk4" event={"ID":"de5a94fd-f271-4229-be43-98ca4a079573","Type":"ContainerStarted","Data":"8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519"}
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.831229 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" path="/var/lib/kubelet/pods/2dad0b1a-c1e1-47b1-b701-85a690f10481/volumes"
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.831932 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f1e6b2-2fc6-4676-8f63-3c9ee5d097da" path="/var/lib/kubelet/pods/40f1e6b2-2fc6-4676-8f63-3c9ee5d097da/volumes"
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.837581 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-cktwb"
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.904117 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts\") pod \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") "
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.904244 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqsj\" (UniqueName: \"kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj\") pod \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\" (UID: \"6d40f3b7-d738-43d5-aa70-943d6c2afd59\") "
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.912009 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d40f3b7-d738-43d5-aa70-943d6c2afd59" (UID: "6d40f3b7-d738-43d5-aa70-943d6c2afd59"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.912106 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj" (OuterVolumeSpecName: "kube-api-access-2sqsj") pod "6d40f3b7-d738-43d5-aa70-943d6c2afd59" (UID: "6d40f3b7-d738-43d5-aa70-943d6c2afd59"). InnerVolumeSpecName "kube-api-access-2sqsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.944351 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-7d2cz"
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.954855 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2mn42"
Mar 20 13:44:17 crc kubenswrapper[4856]: I0320 13:44:17.988761 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.005414 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts\") pod \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") "
Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.005551 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plmmv\" (UniqueName: \"kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv\") pod \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") "
Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.005587 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts\") pod \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\" (UID: \"0081bda3-3f0b-4c1e-a7ab-90af9235521f\") "
Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.005660 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-298bh\" (UniqueName: \"kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh\") pod \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\" (UID: \"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de\") "
Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.005891 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" (UID: "2e64b873-3eb0-418d-9fe3-08d8cd4ea6de"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.006185 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d40f3b7-d738-43d5-aa70-943d6c2afd59-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.006207 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqsj\" (UniqueName: \"kubernetes.io/projected/6d40f3b7-d738-43d5-aa70-943d6c2afd59-kube-api-access-2sqsj\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.006224 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.006749 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0081bda3-3f0b-4c1e-a7ab-90af9235521f" (UID: "0081bda3-3f0b-4c1e-a7ab-90af9235521f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.009846 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh" (OuterVolumeSpecName: "kube-api-access-298bh") pod "2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" (UID: "2e64b873-3eb0-418d-9fe3-08d8cd4ea6de"). InnerVolumeSpecName "kube-api-access-298bh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.010359 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv" (OuterVolumeSpecName: "kube-api-access-plmmv") pod "0081bda3-3f0b-4c1e-a7ab-90af9235521f" (UID: "0081bda3-3f0b-4c1e-a7ab-90af9235521f"). InnerVolumeSpecName "kube-api-access-plmmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.110058 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plmmv\" (UniqueName: \"kubernetes.io/projected/0081bda3-3f0b-4c1e-a7ab-90af9235521f-kube-api-access-plmmv\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.110545 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0081bda3-3f0b-4c1e-a7ab-90af9235521f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.110559 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-298bh\" (UniqueName: \"kubernetes.io/projected/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de-kube-api-access-298bh\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.211913 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0" Mar 20 13:44:18 crc kubenswrapper[4856]: E0320 13:44:18.212135 4856 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:44:18 crc kubenswrapper[4856]: E0320 13:44:18.212153 4856 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:44:18 crc kubenswrapper[4856]: E0320 13:44:18.212212 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift podName:179c29eb-c606-4429-8bbd-f7a4f62790f9 nodeName:}" failed. No retries permitted until 2026-03-20 13:44:34.212192404 +0000 UTC m=+1289.093218534 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift") pod "swift-storage-0" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9") : configmap "swift-ring-files" not found Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.449088 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-812f-account-create-update-7d2cz" event={"ID":"0081bda3-3f0b-4c1e-a7ab-90af9235521f","Type":"ContainerDied","Data":"c2b85719c1d0210f39e49b9b48d39173be34a0937cc408897683bd3a0f5668f4"} Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.449149 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2b85719c1d0210f39e49b9b48d39173be34a0937cc408897683bd3a0f5668f4" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.450381 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-812f-account-create-update-7d2cz" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.454114 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-cktwb" event={"ID":"6d40f3b7-d738-43d5-aa70-943d6c2afd59","Type":"ContainerDied","Data":"fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505"} Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.454143 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb1dae35937e77f4e0674ded48c343409238b280348fc30253feb7dcffbb505" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.454190 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-cktwb" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.460518 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2mn42" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.464341 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2mn42" event={"ID":"2e64b873-3eb0-418d-9fe3-08d8cd4ea6de","Type":"ContainerDied","Data":"623e7237b7a9d1e46c3a4d31ccdf28367dc95df67eda792773df4aaa8cce25af"} Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.464375 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="623e7237b7a9d1e46c3a4d31ccdf28367dc95df67eda792773df4aaa8cce25af" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.810883 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-27vk4" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.921865 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts\") pod \"de5a94fd-f271-4229-be43-98ca4a079573\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.921968 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w42l4\" (UniqueName: \"kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4\") pod \"de5a94fd-f271-4229-be43-98ca4a079573\" (UID: \"de5a94fd-f271-4229-be43-98ca4a079573\") " Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.922460 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de5a94fd-f271-4229-be43-98ca4a079573" (UID: "de5a94fd-f271-4229-be43-98ca4a079573"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4856]: I0320 13:44:18.926633 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4" (OuterVolumeSpecName: "kube-api-access-w42l4") pod "de5a94fd-f271-4229-be43-98ca4a079573" (UID: "de5a94fd-f271-4229-be43-98ca4a079573"). InnerVolumeSpecName "kube-api-access-w42l4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.024682 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de5a94fd-f271-4229-be43-98ca4a079573-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.025038 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w42l4\" (UniqueName: \"kubernetes.io/projected/de5a94fd-f271-4229-be43-98ca4a079573-kube-api-access-w42l4\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.471463 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-27vk4" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.471463 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-27vk4" event={"ID":"de5a94fd-f271-4229-be43-98ca4a079573","Type":"ContainerDied","Data":"8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519"} Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.471627 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a1ff6266a6b42893133956b4223fe9969a8e36656bb7a1babe92aa5fd1d5519" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.474112 4856 generic.go:334] "Generic (PLEG): container finished" podID="a0094c63-b84f-4a1c-839b-47da04da9efb" containerID="c1855580ac0c21798b3e4dc54c1cbf54ac5da36e0b5f20f530bd19ccb88ac3c7" exitCode=0 Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.474208 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hmzq" event={"ID":"a0094c63-b84f-4a1c-839b-47da04da9efb","Type":"ContainerDied","Data":"c1855580ac0c21798b3e4dc54c1cbf54ac5da36e0b5f20f530bd19ccb88ac3c7"} Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.618233 4856 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-db-sync-76pgk"] Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.619371 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="dnsmasq-dns" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.619472 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="dnsmasq-dns" Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.619605 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" containerName="mariadb-database-create" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.619688 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" containerName="mariadb-database-create" Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.619782 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d40f3b7-d738-43d5-aa70-943d6c2afd59" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.619874 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d40f3b7-d738-43d5-aa70-943d6c2afd59" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.619986 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0081bda3-3f0b-4c1e-a7ab-90af9235521f" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620078 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0081bda3-3f0b-4c1e-a7ab-90af9235521f" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.620160 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de5a94fd-f271-4229-be43-98ca4a079573" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620247 4856 
state_mem.go:107] "Deleted CPUSet assignment" podUID="de5a94fd-f271-4229-be43-98ca4a079573" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: E0320 13:44:19.620368 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="init" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620438 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="init" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620742 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0081bda3-3f0b-4c1e-a7ab-90af9235521f" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620842 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" containerName="mariadb-database-create" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.620922 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d40f3b7-d738-43d5-aa70-943d6c2afd59" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.621003 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="de5a94fd-f271-4229-be43-98ca4a079573" containerName="mariadb-account-create-update" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.621086 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="dnsmasq-dns" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.621849 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.624977 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6xjtb" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.625320 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.646347 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-76pgk"] Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.736035 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.736135 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.736182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.736220 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swqf8\" (UniqueName: 
\"kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.837995 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.838086 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.838162 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swqf8\" (UniqueName: \"kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.838350 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.844863 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data\") pod \"glance-db-sync-76pgk\" 
(UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.844976 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.845219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.864732 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swqf8\" (UniqueName: \"kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8\") pod \"glance-db-sync-76pgk\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:19 crc kubenswrapper[4856]: I0320 13:44:19.940409 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.439910 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-76pgk"] Mar 20 13:44:20 crc kubenswrapper[4856]: W0320 13:44:20.444474 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c34367e_1bb1_4e1d_8a11_190bca797f8e.slice/crio-b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4 WatchSource:0}: Error finding container b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4: Status 404 returned error can't find the container with id b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4 Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.482811 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-76pgk" event={"ID":"8c34367e-1bb1-4e1d-8a11-190bca797f8e","Type":"ContainerStarted","Data":"b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4"} Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.524396 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c697k" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:44:20 crc kubenswrapper[4856]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:44:20 crc kubenswrapper[4856]: > Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.561146 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.591543 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.709998 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-6hmzq" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.753958 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754089 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754144 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754184 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754247 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754307 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.754334 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l97\" (UniqueName: \"kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97\") pod \"a0094c63-b84f-4a1c-839b-47da04da9efb\" (UID: \"a0094c63-b84f-4a1c-839b-47da04da9efb\") " Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.755492 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.755745 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.759725 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97" (OuterVolumeSpecName: "kube-api-access-w7l97") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "kube-api-access-w7l97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.762854 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.780537 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.780581 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.791005 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts" (OuterVolumeSpecName: "scripts") pod "a0094c63-b84f-4a1c-839b-47da04da9efb" (UID: "a0094c63-b84f-4a1c-839b-47da04da9efb"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.803194 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c697k-config-6dg42"]
Mar 20 13:44:20 crc kubenswrapper[4856]: E0320 13:44:20.806689 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0094c63-b84f-4a1c-839b-47da04da9efb" containerName="swift-ring-rebalance"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.806733 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0094c63-b84f-4a1c-839b-47da04da9efb" containerName="swift-ring-rebalance"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.807048 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0094c63-b84f-4a1c-839b-47da04da9efb" containerName="swift-ring-rebalance"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.807803 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.809927 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.814470 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k-config-6dg42"]
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.856216 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.857644 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvl5g\" (UniqueName: \"kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.857731 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858057 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858152 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858244 4856 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a0094c63-b84f-4a1c-839b-47da04da9efb-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858258 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858287 4856 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858300 4856 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a0094c63-b84f-4a1c-839b-47da04da9efb-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858313 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858324 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l97\" (UniqueName: \"kubernetes.io/projected/a0094c63-b84f-4a1c-839b-47da04da9efb-kube-api-access-w7l97\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.858336 4856 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a0094c63-b84f-4a1c-839b-47da04da9efb-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959180 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959244 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959338 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959367 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959416 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvl5g\" (UniqueName: \"kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959448 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959714 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959762 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959917 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.959977 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.961331 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:20 crc kubenswrapper[4856]: I0320 13:44:20.979033 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvl5g\" (UniqueName: \"kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g\") pod \"ovn-controller-c697k-config-6dg42\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") " pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.129972 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.232650 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-86db49b7ff-fnsbs" podUID="2dad0b1a-c1e1-47b1-b701-85a690f10481" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout"
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.502904 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-6hmzq"
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.502990 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-6hmzq" event={"ID":"a0094c63-b84f-4a1c-839b-47da04da9efb","Type":"ContainerDied","Data":"90fd1ff6b40a1a17d3768a310806201b51880c924814918d5183b6a7e8964881"}
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.503541 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90fd1ff6b40a1a17d3768a310806201b51880c924814918d5183b6a7e8964881"
Mar 20 13:44:21 crc kubenswrapper[4856]: I0320 13:44:21.556562 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k-config-6dg42"]
Mar 20 13:44:21 crc kubenswrapper[4856]: W0320 13:44:21.559221 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4c2e7f9_6a4e_45c0_89f2_5fc3621b95a0.slice/crio-8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc WatchSource:0}: Error finding container 8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc: Status 404 returned error can't find the container with id 8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc
Mar 20 13:44:22 crc kubenswrapper[4856]: I0320 13:44:22.115502 4856 scope.go:117] "RemoveContainer" containerID="c5d04cfd30d25aaf9a98a7aa45f4088204b21ee4445209e91e098b48ea2f7729"
Mar 20 13:44:22 crc kubenswrapper[4856]: I0320 13:44:22.518330 4856 generic.go:334] "Generic (PLEG): container finished" podID="a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" containerID="e508e1652b92ae1e87caf70b07b8b6ca94d53d51788493cfc4280c00f1d42a1f" exitCode=0
Mar 20 13:44:22 crc kubenswrapper[4856]: I0320 13:44:22.518396 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-6dg42" event={"ID":"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0","Type":"ContainerDied","Data":"e508e1652b92ae1e87caf70b07b8b6ca94d53d51788493cfc4280c00f1d42a1f"}
Mar 20 13:44:22 crc kubenswrapper[4856]: I0320 13:44:22.518434 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-6dg42" event={"ID":"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0","Type":"ContainerStarted","Data":"8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc"}
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.870680 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916010 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916288 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916520 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvl5g\" (UniqueName: \"kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916563 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916606 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.916702 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts\") pod \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\" (UID: \"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0\") "
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.918628 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts" (OuterVolumeSpecName: "scripts") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.918694 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.918761 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run" (OuterVolumeSpecName: "var-run") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.919189 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.920092 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:23 crc kubenswrapper[4856]: I0320 13:44:23.928600 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g" (OuterVolumeSpecName: "kube-api-access-lvl5g") pod "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" (UID: "a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0"). InnerVolumeSpecName "kube-api-access-lvl5g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019495 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvl5g\" (UniqueName: \"kubernetes.io/projected/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-kube-api-access-lvl5g\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019540 4856 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019561 4856 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019575 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019586 4856 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.019595 4856 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0-var-run\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.535011 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-6dg42" event={"ID":"a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0","Type":"ContainerDied","Data":"8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc"}
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.535340 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ea6eb1a7d3517f2a1bb9181ca6321834039b36b8ea76e40fd66ecdf0232b8cc"
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.535148 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-6dg42"
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.950113 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c697k-config-6dg42"]
Mar 20 13:44:24 crc kubenswrapper[4856]: I0320 13:44:24.962732 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c697k-config-6dg42"]
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.079226 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c697k-config-cz2b7"]
Mar 20 13:44:25 crc kubenswrapper[4856]: E0320 13:44:25.079580 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" containerName="ovn-config"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.079592 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" containerName="ovn-config"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.079780 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" containerName="ovn-config"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.080254 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.087757 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.092636 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k-config-cz2b7"]
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141572 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141657 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tz4\" (UniqueName: \"kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141693 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141735 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141822 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.141875 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243246 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243345 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tz4\" (UniqueName: \"kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243373 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243400 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243460 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243503 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243747 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243816 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.243880 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.244665 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.245972 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.262678 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tz4\" (UniqueName: \"kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4\") pod \"ovn-controller-c697k-config-cz2b7\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") " pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.398025 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.520109 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-c697k"
Mar 20 13:44:25 crc kubenswrapper[4856]: I0320 13:44:25.834087 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0" path="/var/lib/kubelet/pods/a4c2e7f9-6a4e-45c0-89f2-5fc3621b95a0/volumes"
Mar 20 13:44:28 crc kubenswrapper[4856]: I0320 13:44:28.568848 4856 generic.go:334] "Generic (PLEG): container finished" podID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerID="5ab9a98d399750f4cde1f783ae91e1d160bff7a9ccceea461dbfd346e6e256f6" exitCode=0
Mar 20 13:44:28 crc kubenswrapper[4856]: I0320 13:44:28.568970 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerDied","Data":"5ab9a98d399750f4cde1f783ae91e1d160bff7a9ccceea461dbfd346e6e256f6"}
Mar 20 13:44:28 crc kubenswrapper[4856]: I0320 13:44:28.572648 4856 generic.go:334] "Generic (PLEG): container finished" podID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerID="9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630" exitCode=0
Mar 20 13:44:28 crc kubenswrapper[4856]: I0320 13:44:28.572699 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerDied","Data":"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630"}
Mar 20 13:44:32 crc kubenswrapper[4856]: I0320 13:44:32.544124 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c697k-config-cz2b7"]
Mar 20 13:44:32 crc kubenswrapper[4856]: W0320 13:44:32.544585 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1227442a_7077_4a27_a984_09cbd96fbc1b.slice/crio-4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11 WatchSource:0}: Error finding container 4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11: Status 404 returned error can't find the container with id 4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11
Mar 20 13:44:32 crc kubenswrapper[4856]: I0320 13:44:32.623763 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-cz2b7" event={"ID":"1227442a-7077-4a27-a984-09cbd96fbc1b","Type":"ContainerStarted","Data":"4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11"}
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.636017 4856 generic.go:334] "Generic (PLEG): container finished" podID="1227442a-7077-4a27-a984-09cbd96fbc1b" containerID="c37597c183ad2be22cd4505258c442e053d875efb09f132479a632f17125202b" exitCode=0
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.636201 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-cz2b7" event={"ID":"1227442a-7077-4a27-a984-09cbd96fbc1b","Type":"ContainerDied","Data":"c37597c183ad2be22cd4505258c442e053d875efb09f132479a632f17125202b"}
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.641235 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerStarted","Data":"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e"}
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.642128 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.652985 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerStarted","Data":"d2421faa30776943955dd45b063c08f4fd3835932d949de6ba81f54b8fc10d07"}
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.653244 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.683340 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.627445892 podStartE2EDuration="1m29.68329426s" podCreationTimestamp="2026-03-20 13:43:04 +0000 UTC" firstStartedPulling="2026-03-20 13:43:05.921038565 +0000 UTC m=+1200.802064695" lastFinishedPulling="2026-03-20 13:43:54.976886933 +0000 UTC m=+1249.857913063" observedRunningTime="2026-03-20 13:44:33.672744491 +0000 UTC m=+1288.553770641" watchObservedRunningTime="2026-03-20 13:44:33.68329426 +0000 UTC m=+1288.564320420"
Mar 20 13:44:33 crc kubenswrapper[4856]: I0320 13:44:33.702009 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.388990792 podStartE2EDuration="1m29.701990611s" podCreationTimestamp="2026-03-20 13:43:04 +0000 UTC" firstStartedPulling="2026-03-20 13:43:06.284904198 +0000 UTC m=+1201.165930328" lastFinishedPulling="2026-03-20 13:43:55.597904017 +0000 UTC m=+1250.478930147" observedRunningTime="2026-03-20 13:44:33.697865039 +0000 UTC m=+1288.578891179" watchObservedRunningTime="2026-03-20 13:44:33.701990611 +0000 UTC m=+1288.583016741"
Mar 20 13:44:34 crc kubenswrapper[4856]: I0320 13:44:34.212857 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:34 crc kubenswrapper[4856]: I0320 13:44:34.226381 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"swift-storage-0\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " pod="openstack/swift-storage-0"
Mar 20 13:44:34 crc kubenswrapper[4856]: I0320 13:44:34.469991 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 20 13:44:34 crc kubenswrapper[4856]: I0320 13:44:34.671674 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-76pgk" event={"ID":"8c34367e-1bb1-4e1d-8a11-190bca797f8e","Type":"ContainerStarted","Data":"70a9aeecd1a7c315226a9a8375517e93271afd2ec56f559f08f46435a6345ac8"}
Mar 20 13:44:34 crc kubenswrapper[4856]: I0320 13:44:34.693885 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-76pgk" podStartSLOduration=2.972001444 podStartE2EDuration="15.693039488s" podCreationTimestamp="2026-03-20 13:44:19 +0000 UTC" firstStartedPulling="2026-03-20 13:44:20.446557395 +0000 UTC m=+1275.327583525" lastFinishedPulling="2026-03-20 13:44:33.167595439 +0000 UTC m=+1288.048621569" observedRunningTime="2026-03-20 13:44:34.692265567 +0000 UTC m=+1289.573291717" watchObservedRunningTime="2026-03-20 13:44:34.693039488 +0000 UTC m=+1289.574065618"
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.033389 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-cz2b7"
Mar 20 13:44:35 crc kubenswrapper[4856]: W0320 13:44:35.037537 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod179c29eb_c606_4429_8bbd_f7a4f62790f9.slice/crio-4575417590e14a7fb3606e98da48c8cff9594ce441e6131c18a30115f63ac022 WatchSource:0}: Error finding container 4575417590e14a7fb3606e98da48c8cff9594ce441e6131c18a30115f63ac022: Status 404 returned error can't find the container with id 4575417590e14a7fb3606e98da48c8cff9594ce441e6131c18a30115f63ac022
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.053433 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.134732 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.134836 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.134868 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.134941 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.134989 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.135049 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tz4\" (UniqueName: \"kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4\") pod \"1227442a-7077-4a27-a984-09cbd96fbc1b\" (UID: \"1227442a-7077-4a27-a984-09cbd96fbc1b\") "
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.135988 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.136061 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "var-log-ovn".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.136461 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts" (OuterVolumeSpecName: "scripts") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.136603 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.136000 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run" (OuterVolumeSpecName: "var-run") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.141974 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4" (OuterVolumeSpecName: "kube-api-access-g2tz4") pod "1227442a-7077-4a27-a984-09cbd96fbc1b" (UID: "1227442a-7077-4a27-a984-09cbd96fbc1b"). InnerVolumeSpecName "kube-api-access-g2tz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236801 4856 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236845 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tz4\" (UniqueName: \"kubernetes.io/projected/1227442a-7077-4a27-a984-09cbd96fbc1b-kube-api-access-g2tz4\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236864 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236874 4856 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236885 4856 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1227442a-7077-4a27-a984-09cbd96fbc1b-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.236895 4856 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1227442a-7077-4a27-a984-09cbd96fbc1b-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.680328 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k-config-cz2b7" event={"ID":"1227442a-7077-4a27-a984-09cbd96fbc1b","Type":"ContainerDied","Data":"4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11"} Mar 20 13:44:35 crc 
kubenswrapper[4856]: I0320 13:44:35.681740 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb58b5b00094bcbc3176549cd02d1f084850833fa0bf1b888b01d97097a6f11" Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.681847 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"4575417590e14a7fb3606e98da48c8cff9594ce441e6131c18a30115f63ac022"} Mar 20 13:44:35 crc kubenswrapper[4856]: I0320 13:44:35.680341 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k-config-cz2b7" Mar 20 13:44:36 crc kubenswrapper[4856]: I0320 13:44:36.134504 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c697k-config-cz2b7"] Mar 20 13:44:36 crc kubenswrapper[4856]: I0320 13:44:36.145785 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c697k-config-cz2b7"] Mar 20 13:44:37 crc kubenswrapper[4856]: I0320 13:44:37.831181 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1227442a-7077-4a27-a984-09cbd96fbc1b" path="/var/lib/kubelet/pods/1227442a-7077-4a27-a984-09cbd96fbc1b/volumes" Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.745856 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"bf17e5b77e3a7a1bdf3ad21621b736ab3a7b00f1bfc4f9f9e63067c32d46e273"} Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.746345 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"7fec7d9d05c7e6a547275f47329f4ae8d58fe68cc7259691ff5842999eacf987"} Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.746357 4856 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"103cdf38f9ed46d33180e33da2171b27c42859dedc97518509e187489741b123"} Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.746366 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"99aed41847b9a822596138e8aef2e4873222dfb5643d9cd387d54e1029fa26ae"} Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.987929 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:44:39 crc kubenswrapper[4856]: I0320 13:44:39.987990 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:44:41 crc kubenswrapper[4856]: I0320 13:44:41.764912 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"c25c88d12f3de7091d79c129a51dbb13814ceb3eb7e0f5f552600e9715f22cd3"} Mar 20 13:44:41 crc kubenswrapper[4856]: I0320 13:44:41.766465 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"66ddb19b6cb3cfb47423e82ae3b8ce578f9275b8ddaeaefca9cb0f6db6d03dd4"} Mar 20 13:44:41 crc kubenswrapper[4856]: I0320 13:44:41.766552 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"9708b2248759cc0b809d2397329f741db5de5c3791b0c0d67c59ef3236106ae9"} Mar 20 13:44:41 crc kubenswrapper[4856]: I0320 13:44:41.766620 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"336bef9fbe708a542dba755da9664d99b5431f33ebb505b3d410fc05b0726883"} Mar 20 13:44:43 crc kubenswrapper[4856]: I0320 13:44:43.788909 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"8dc8373e74ff37fef6751f9af5ae4fe48e9297688f83e021db44786f7698fae2"} Mar 20 13:44:43 crc kubenswrapper[4856]: I0320 13:44:43.789485 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"3ced5a863f2674ab088b2cfc34623e28ac2f1620c7f6f8dc4f2edb1bd867f7c6"} Mar 20 13:44:43 crc kubenswrapper[4856]: I0320 13:44:43.789502 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"7324448e1753fd76381dd12b6e7d9dc16d8ab4a8e4930a9aeb6e2de164019847"} Mar 20 13:44:43 crc kubenswrapper[4856]: I0320 13:44:43.789513 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"9c977a756055e2886ad5ca74cb43b1715a8e35dc20df5e2db03dadb213f99ae2"} Mar 20 13:44:44 crc kubenswrapper[4856]: I0320 13:44:44.809629 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"64b602f06351a958adc8f20a603944b2103c33e0a114f4cd698496a6b2cd9d5a"} Mar 20 13:44:44 crc 
kubenswrapper[4856]: I0320 13:44:44.813092 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"2bbe17d4032ceb3620e99676e538a3436047e64cd35c8e408a7e830c8d0c8916"} Mar 20 13:44:44 crc kubenswrapper[4856]: I0320 13:44:44.813361 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerStarted","Data":"7b0eeecc01033a001f3ec16d0f85af1f0a2b22608ba9b74a4124f80db63f7023"} Mar 20 13:44:44 crc kubenswrapper[4856]: I0320 13:44:44.855938 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.882373426 podStartE2EDuration="43.854847986s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:35.039457584 +0000 UTC m=+1289.920483714" lastFinishedPulling="2026-03-20 13:44:43.011932154 +0000 UTC m=+1297.892958274" observedRunningTime="2026-03-20 13:44:44.850784564 +0000 UTC m=+1299.731810714" watchObservedRunningTime="2026-03-20 13:44:44.854847986 +0000 UTC m=+1299.735874136" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.173771 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:45 crc kubenswrapper[4856]: E0320 13:44:45.174492 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1227442a-7077-4a27-a984-09cbd96fbc1b" containerName="ovn-config" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.174607 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1227442a-7077-4a27-a984-09cbd96fbc1b" containerName="ovn-config" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.174859 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1227442a-7077-4a27-a984-09cbd96fbc1b" containerName="ovn-config" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 
13:44:45.175963 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.177826 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.188156 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303210 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303346 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303402 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303425 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: 
\"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303630 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdxsc\" (UniqueName: \"kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.303781 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.405654 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.405768 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.405819 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: 
\"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.405950 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdxsc\" (UniqueName: \"kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.406045 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.406164 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.407610 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.407610 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " 
pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.407837 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.407841 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.408164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.430226 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdxsc\" (UniqueName: \"kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc\") pod \"dnsmasq-dns-764c5664d7-tj65h\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.434085 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.495926 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.821909 4856 generic.go:334] "Generic (PLEG): container finished" podID="8c34367e-1bb1-4e1d-8a11-190bca797f8e" containerID="70a9aeecd1a7c315226a9a8375517e93271afd2ec56f559f08f46435a6345ac8" exitCode=0 Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.833564 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-76pgk" event={"ID":"8c34367e-1bb1-4e1d-8a11-190bca797f8e","Type":"ContainerDied","Data":"70a9aeecd1a7c315226a9a8375517e93271afd2ec56f559f08f46435a6345ac8"} Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.853035 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:44:45 crc kubenswrapper[4856]: I0320 13:44:45.950464 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:45 crc kubenswrapper[4856]: W0320 13:44:45.957512 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaced2eae_1557_490d_8808_ac1ee0761fb9.slice/crio-4cb699b45f60af94d3c598281432089796b845fab914328dee9cc47dfed1412a WatchSource:0}: Error finding container 4cb699b45f60af94d3c598281432089796b845fab914328dee9cc47dfed1412a: Status 404 returned error can't find the container with id 4cb699b45f60af94d3c598281432089796b845fab914328dee9cc47dfed1412a Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.204817 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-94qws"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.206009 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.241834 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-94qws"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.322674 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.322735 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl7vc\" (UniqueName: \"kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.332493 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fc05-account-create-update-4c8xw"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.334029 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.339423 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.344627 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fc05-account-create-update-4c8xw"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.424321 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.424695 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl7vc\" (UniqueName: \"kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.424828 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.424988 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxg9v\" (UniqueName: \"kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: 
\"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.425126 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.442672 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-zgj9s"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.447128 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.472365 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zgj9s"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.478031 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl7vc\" (UniqueName: \"kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc\") pod \"cinder-db-create-94qws\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.526753 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhplr\" (UniqueName: \"kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.526856 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.526926 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.526991 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxg9v\" (UniqueName: \"kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.528017 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.529324 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-94qws" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.554589 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e816-account-create-update-rw6jb"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.555788 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.560002 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.577822 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxg9v\" (UniqueName: \"kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v\") pod \"cinder-fc05-account-create-update-4c8xw\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.577880 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6bw48"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.578975 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.591516 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e816-account-create-update-rw6jb"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.608804 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6bw48"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.628180 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhplr\" (UniqueName: \"kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.628249 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.637544 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jncl\" (UniqueName: \"kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.637603 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.637685 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvgqt\" (UniqueName: \"kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.637836 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.638699 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.665182 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhplr\" (UniqueName: \"kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr\") pod \"barbican-db-create-zgj9s\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.693250 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ck59h"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.694398 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.696094 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.696289 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.696635 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-svnn6" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.698459 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.707117 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.707483 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ck59h"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.729222 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-171f-account-create-update-qzwkr"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.730537 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.734535 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743580 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8cvq\" (UniqueName: \"kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743654 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jncl\" (UniqueName: \"kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743696 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc 
kubenswrapper[4856]: I0320 13:44:46.743735 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvgqt\" (UniqueName: \"kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743806 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743858 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.743892 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.744823 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " 
pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.744852 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.760108 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-171f-account-create-update-qzwkr"] Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.764141 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvgqt\" (UniqueName: \"kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt\") pod \"barbican-e816-account-create-update-rw6jb\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.764414 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jncl\" (UniqueName: \"kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl\") pod \"neutron-db-create-6bw48\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.835287 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.835611 4856 generic.go:334] "Generic (PLEG): container finished" podID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerID="e4a69faaa223cfec399d7968d877f03a45b5b42cf94e4121e04f0b2dc26b0dfb" exitCode=0 Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.835649 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" event={"ID":"aced2eae-1557-490d-8808-ac1ee0761fb9","Type":"ContainerDied","Data":"e4a69faaa223cfec399d7968d877f03a45b5b42cf94e4121e04f0b2dc26b0dfb"} Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.835674 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" event={"ID":"aced2eae-1557-490d-8808-ac1ee0761fb9","Type":"ContainerStarted","Data":"4cb699b45f60af94d3c598281432089796b845fab914328dee9cc47dfed1412a"} Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.844795 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.844905 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8cvq\" (UniqueName: \"kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.844978 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle\") pod 
\"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.845030 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ndcb\" (UniqueName: \"kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.845083 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.848616 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.859340 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data\") pod \"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.865674 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8cvq\" (UniqueName: \"kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq\") pod 
\"keystone-db-sync-ck59h\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.946016 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ndcb\" (UniqueName: \"kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.946166 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.947935 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.966978 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ndcb\" (UniqueName: \"kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb\") pod \"neutron-171f-account-create-update-qzwkr\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:46 crc kubenswrapper[4856]: I0320 13:44:46.994826 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.020435 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.023828 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-94qws"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.032606 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.065521 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.281187 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fc05-account-create-update-4c8xw"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.286946 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:47 crc kubenswrapper[4856]: W0320 13:44:47.300335 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod119ab08d_caf7_497f_b11d_fb0d06a34600.slice/crio-333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8 WatchSource:0}: Error finding container 333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8: Status 404 returned error can't find the container with id 333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8 Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.352020 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle\") pod \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.352054 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swqf8\" (UniqueName: \"kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8\") pod \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.352082 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data\") pod \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\" (UID: \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.352106 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data\") pod \"8c34367e-1bb1-4e1d-8a11-190bca797f8e\" (UID: 
\"8c34367e-1bb1-4e1d-8a11-190bca797f8e\") " Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.358166 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "8c34367e-1bb1-4e1d-8a11-190bca797f8e" (UID: "8c34367e-1bb1-4e1d-8a11-190bca797f8e"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.358937 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8" (OuterVolumeSpecName: "kube-api-access-swqf8") pod "8c34367e-1bb1-4e1d-8a11-190bca797f8e" (UID: "8c34367e-1bb1-4e1d-8a11-190bca797f8e"). InnerVolumeSpecName "kube-api-access-swqf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.397933 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-zgj9s"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.398607 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c34367e-1bb1-4e1d-8a11-190bca797f8e" (UID: "8c34367e-1bb1-4e1d-8a11-190bca797f8e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.414164 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data" (OuterVolumeSpecName: "config-data") pod "8c34367e-1bb1-4e1d-8a11-190bca797f8e" (UID: "8c34367e-1bb1-4e1d-8a11-190bca797f8e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.455416 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.455461 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swqf8\" (UniqueName: \"kubernetes.io/projected/8c34367e-1bb1-4e1d-8a11-190bca797f8e-kube-api-access-swqf8\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.455480 4856 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.455493 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c34367e-1bb1-4e1d-8a11-190bca797f8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.653678 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e816-account-create-update-rw6jb"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.663622 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6bw48"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.673424 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ck59h"] Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.766162 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-171f-account-create-update-qzwkr"] Mar 20 13:44:47 crc kubenswrapper[4856]: W0320 13:44:47.815328 4856 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c197353_35fa_478c_816b_c85320d3af70.slice/crio-2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f WatchSource:0}: Error finding container 2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f: Status 404 returned error can't find the container with id 2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.842875 4856 generic.go:334] "Generic (PLEG): container finished" podID="119ab08d-caf7-497f-b11d-fb0d06a34600" containerID="5d5613f916e03b4314b4133cf18da4fd036e61c334a7e2755e150637c92b7cd9" exitCode=0 Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.842931 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fc05-account-create-update-4c8xw" event={"ID":"119ab08d-caf7-497f-b11d-fb0d06a34600","Type":"ContainerDied","Data":"5d5613f916e03b4314b4133cf18da4fd036e61c334a7e2755e150637c92b7cd9"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.842953 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fc05-account-create-update-4c8xw" event={"ID":"119ab08d-caf7-497f-b11d-fb0d06a34600","Type":"ContainerStarted","Data":"333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.844789 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e816-account-create-update-rw6jb" event={"ID":"f46ba794-6fed-49d3-a0ce-1be8e5a623d4","Type":"ContainerStarted","Data":"f940666617a86f91036338a0a09e9bff77dc039a5188c4b40d60da7e0e9a4c78"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.846948 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" event={"ID":"aced2eae-1557-490d-8808-ac1ee0761fb9","Type":"ContainerStarted","Data":"dc1f04b0444878cf9bfead0cb47bc72cfedb5a6f2872fbcc2d61146577b113cc"} Mar 20 13:44:47 crc 
kubenswrapper[4856]: I0320 13:44:47.847082 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.848010 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-171f-account-create-update-qzwkr" event={"ID":"6c197353-35fa-478c-816b-c85320d3af70","Type":"ContainerStarted","Data":"2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.851039 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-76pgk" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.851626 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-76pgk" event={"ID":"8c34367e-1bb1-4e1d-8a11-190bca797f8e","Type":"ContainerDied","Data":"b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.851652 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3085c441321de051e7d1cf38af6ba78a6071da4f7800ca59e623cfa5bc491f4" Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.853048 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zgj9s" event={"ID":"1f749db7-7069-4107-90ea-edfc4ea7dc7f","Type":"ContainerStarted","Data":"2a042cc4e28d2645cd41e9cac3dae6d1aedb80c69cbc119c7f670a0e6081ff5a"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.853074 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zgj9s" event={"ID":"1f749db7-7069-4107-90ea-edfc4ea7dc7f","Type":"ContainerStarted","Data":"f066a5756d9e53997ad10e000d0e99d5c56aff64fdf8d23d6240c50bf4373ce3"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.862414 4856 generic.go:334] "Generic (PLEG): container finished" podID="e928ca03-85a4-4e75-bfc2-6752d35d34ab" 
containerID="e665ed6ce24fbf7ef1d1d05407ed507322019dcf1a1f8c42525de5d8898eb0e4" exitCode=0 Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.862571 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-94qws" event={"ID":"e928ca03-85a4-4e75-bfc2-6752d35d34ab","Type":"ContainerDied","Data":"e665ed6ce24fbf7ef1d1d05407ed507322019dcf1a1f8c42525de5d8898eb0e4"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.862624 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-94qws" event={"ID":"e928ca03-85a4-4e75-bfc2-6752d35d34ab","Type":"ContainerStarted","Data":"9c118a165089f15b470df7d118088b3d41703fe5ca1bcb6182645b31182d75ae"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.883476 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bw48" event={"ID":"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4","Type":"ContainerStarted","Data":"ac4192b0b41ffb2b11aa5b2ff31053e3f56e260bc1d740fe98ccc5a26aa1b258"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.885923 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ck59h" event={"ID":"00641aa4-4bea-4510-b7a7-a5e1c022340a","Type":"ContainerStarted","Data":"c30a83b9aa01d9ca092b49fed8bc053369c9dcfda47cc6fec3f13a6d5c6925d1"} Mar 20 13:44:47 crc kubenswrapper[4856]: I0320 13:44:47.897852 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" podStartSLOduration=2.897837146 podStartE2EDuration="2.897837146s" podCreationTimestamp="2026-03-20 13:44:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:47.882765781 +0000 UTC m=+1302.763791911" watchObservedRunningTime="2026-03-20 13:44:47.897837146 +0000 UTC m=+1302.778863276" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.165127 4856 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.194698 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:44:48 crc kubenswrapper[4856]: E0320 13:44:48.195144 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c34367e-1bb1-4e1d-8a11-190bca797f8e" containerName="glance-db-sync" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.195166 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c34367e-1bb1-4e1d-8a11-190bca797f8e" containerName="glance-db-sync" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.195421 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c34367e-1bb1-4e1d-8a11-190bca797f8e" containerName="glance-db-sync" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.196505 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.204887 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.269899 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.269972 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 
13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.270011 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwg27\" (UniqueName: \"kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.270047 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.270075 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.270097 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371374 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" 
Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371427 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371448 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwg27\" (UniqueName: \"kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371482 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371510 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.371535 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 
13:44:48.372230 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.373249 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.373709 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.373901 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.373928 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.391602 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwg27\" 
(UniqueName: \"kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27\") pod \"dnsmasq-dns-74f6bcbc87-kvcg6\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.512550 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.898501 4856 generic.go:334] "Generic (PLEG): container finished" podID="ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" containerID="84d846a4b23272139a017e7b80a305138eb8dcb2d4719bcd2bb1f9b05fa1a53d" exitCode=0 Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.898675 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bw48" event={"ID":"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4","Type":"ContainerDied","Data":"84d846a4b23272139a017e7b80a305138eb8dcb2d4719bcd2bb1f9b05fa1a53d"} Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.902753 4856 generic.go:334] "Generic (PLEG): container finished" podID="1f749db7-7069-4107-90ea-edfc4ea7dc7f" containerID="2a042cc4e28d2645cd41e9cac3dae6d1aedb80c69cbc119c7f670a0e6081ff5a" exitCode=0 Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.902831 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zgj9s" event={"ID":"1f749db7-7069-4107-90ea-edfc4ea7dc7f","Type":"ContainerDied","Data":"2a042cc4e28d2645cd41e9cac3dae6d1aedb80c69cbc119c7f670a0e6081ff5a"} Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.906585 4856 generic.go:334] "Generic (PLEG): container finished" podID="f46ba794-6fed-49d3-a0ce-1be8e5a623d4" containerID="f6981b172eccdb9553125740236321d68c8c0731d63afc46cf2040e981ea85b8" exitCode=0 Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.906875 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e816-account-create-update-rw6jb" 
event={"ID":"f46ba794-6fed-49d3-a0ce-1be8e5a623d4","Type":"ContainerDied","Data":"f6981b172eccdb9553125740236321d68c8c0731d63afc46cf2040e981ea85b8"} Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.908105 4856 generic.go:334] "Generic (PLEG): container finished" podID="6c197353-35fa-478c-816b-c85320d3af70" containerID="064a0bd3d163eca864105c246a50becc9a7b5a97a2cb898d52c76240f0bad351" exitCode=0 Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.908579 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-171f-account-create-update-qzwkr" event={"ID":"6c197353-35fa-478c-816b-c85320d3af70","Type":"ContainerDied","Data":"064a0bd3d163eca864105c246a50becc9a7b5a97a2cb898d52c76240f0bad351"} Mar 20 13:44:48 crc kubenswrapper[4856]: I0320 13:44:48.979231 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:44:49 crc kubenswrapper[4856]: W0320 13:44:49.003777 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36374778_f2e8_453d_81bc_b76216ab56b3.slice/crio-3ab7adf506da3ffbbb084edf5725d2ae79d46e27c9ec961d9afbc8c89add1778 WatchSource:0}: Error finding container 3ab7adf506da3ffbbb084edf5725d2ae79d46e27c9ec961d9afbc8c89add1778: Status 404 returned error can't find the container with id 3ab7adf506da3ffbbb084edf5725d2ae79d46e27c9ec961d9afbc8c89add1778 Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.243957 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.285806 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhplr\" (UniqueName: \"kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr\") pod \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.285915 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts\") pod \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\" (UID: \"1f749db7-7069-4107-90ea-edfc4ea7dc7f\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.286877 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1f749db7-7069-4107-90ea-edfc4ea7dc7f" (UID: "1f749db7-7069-4107-90ea-edfc4ea7dc7f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.289983 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr" (OuterVolumeSpecName: "kube-api-access-dhplr") pod "1f749db7-7069-4107-90ea-edfc4ea7dc7f" (UID: "1f749db7-7069-4107-90ea-edfc4ea7dc7f"). InnerVolumeSpecName "kube-api-access-dhplr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.330261 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.346483 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-94qws" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388118 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts\") pod \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388184 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxg9v\" (UniqueName: \"kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v\") pod \"119ab08d-caf7-497f-b11d-fb0d06a34600\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388341 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts\") pod \"119ab08d-caf7-497f-b11d-fb0d06a34600\" (UID: \"119ab08d-caf7-497f-b11d-fb0d06a34600\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388368 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cl7vc\" (UniqueName: \"kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc\") pod \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\" (UID: \"e928ca03-85a4-4e75-bfc2-6752d35d34ab\") " Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388796 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhplr\" (UniqueName: \"kubernetes.io/projected/1f749db7-7069-4107-90ea-edfc4ea7dc7f-kube-api-access-dhplr\") on node \"crc\" DevicePath \"\"" 
Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388820 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1f749db7-7069-4107-90ea-edfc4ea7dc7f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.388925 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e928ca03-85a4-4e75-bfc2-6752d35d34ab" (UID: "e928ca03-85a4-4e75-bfc2-6752d35d34ab"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.389482 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "119ab08d-caf7-497f-b11d-fb0d06a34600" (UID: "119ab08d-caf7-497f-b11d-fb0d06a34600"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.391898 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v" (OuterVolumeSpecName: "kube-api-access-gxg9v") pod "119ab08d-caf7-497f-b11d-fb0d06a34600" (UID: "119ab08d-caf7-497f-b11d-fb0d06a34600"). InnerVolumeSpecName "kube-api-access-gxg9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.393500 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc" (OuterVolumeSpecName: "kube-api-access-cl7vc") pod "e928ca03-85a4-4e75-bfc2-6752d35d34ab" (UID: "e928ca03-85a4-4e75-bfc2-6752d35d34ab"). 
InnerVolumeSpecName "kube-api-access-cl7vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.490412 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/119ab08d-caf7-497f-b11d-fb0d06a34600-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.490445 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cl7vc\" (UniqueName: \"kubernetes.io/projected/e928ca03-85a4-4e75-bfc2-6752d35d34ab-kube-api-access-cl7vc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.490457 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e928ca03-85a4-4e75-bfc2-6752d35d34ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.490465 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxg9v\" (UniqueName: \"kubernetes.io/projected/119ab08d-caf7-497f-b11d-fb0d06a34600-kube-api-access-gxg9v\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.920797 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fc05-account-create-update-4c8xw" event={"ID":"119ab08d-caf7-497f-b11d-fb0d06a34600","Type":"ContainerDied","Data":"333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8"} Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.921122 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="333b9e393aa816b84cf1d203f4a903aa38c86b9441bf985fa7b96c7683eb91e8" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.920850 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fc05-account-create-update-4c8xw" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.922169 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-94qws" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.923187 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-94qws" event={"ID":"e928ca03-85a4-4e75-bfc2-6752d35d34ab","Type":"ContainerDied","Data":"9c118a165089f15b470df7d118088b3d41703fe5ca1bcb6182645b31182d75ae"} Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.923488 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c118a165089f15b470df7d118088b3d41703fe5ca1bcb6182645b31182d75ae" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.924500 4856 generic.go:334] "Generic (PLEG): container finished" podID="36374778-f2e8-453d-81bc-b76216ab56b3" containerID="9314c73d5a46200720ca35902f50b8e44d58eaf3110c01669944e1e90e9ae266" exitCode=0 Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.924557 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" event={"ID":"36374778-f2e8-453d-81bc-b76216ab56b3","Type":"ContainerDied","Data":"9314c73d5a46200720ca35902f50b8e44d58eaf3110c01669944e1e90e9ae266"} Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.924580 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" event={"ID":"36374778-f2e8-453d-81bc-b76216ab56b3","Type":"ContainerStarted","Data":"3ab7adf506da3ffbbb084edf5725d2ae79d46e27c9ec961d9afbc8c89add1778"} Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.928340 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-zgj9s" event={"ID":"1f749db7-7069-4107-90ea-edfc4ea7dc7f","Type":"ContainerDied","Data":"f066a5756d9e53997ad10e000d0e99d5c56aff64fdf8d23d6240c50bf4373ce3"} Mar 20 13:44:49 crc 
kubenswrapper[4856]: I0320 13:44:49.928363 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f066a5756d9e53997ad10e000d0e99d5c56aff64fdf8d23d6240c50bf4373ce3" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.928495 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-zgj9s" Mar 20 13:44:49 crc kubenswrapper[4856]: I0320 13:44:49.928710 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerName="dnsmasq-dns" containerID="cri-o://dc1f04b0444878cf9bfead0cb47bc72cfedb5a6f2872fbcc2d61146577b113cc" gracePeriod=10 Mar 20 13:44:50 crc kubenswrapper[4856]: I0320 13:44:50.941004 4856 generic.go:334] "Generic (PLEG): container finished" podID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerID="dc1f04b0444878cf9bfead0cb47bc72cfedb5a6f2872fbcc2d61146577b113cc" exitCode=0 Mar 20 13:44:50 crc kubenswrapper[4856]: I0320 13:44:50.941089 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" event={"ID":"aced2eae-1557-490d-8808-ac1ee0761fb9","Type":"ContainerDied","Data":"dc1f04b0444878cf9bfead0cb47bc72cfedb5a6f2872fbcc2d61146577b113cc"} Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.804307 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.810775 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.821294 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.852950 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jncl\" (UniqueName: \"kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl\") pod \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.853017 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvgqt\" (UniqueName: \"kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt\") pod \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.853154 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ndcb\" (UniqueName: \"kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb\") pod \"6c197353-35fa-478c-816b-c85320d3af70\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.853190 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts\") pod \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\" (UID: \"f46ba794-6fed-49d3-a0ce-1be8e5a623d4\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.853230 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts\") pod \"6c197353-35fa-478c-816b-c85320d3af70\" (UID: \"6c197353-35fa-478c-816b-c85320d3af70\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.853366 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts\") pod \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\" (UID: \"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4\") " Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.857170 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c197353-35fa-478c-816b-c85320d3af70" (UID: "6c197353-35fa-478c-816b-c85320d3af70"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.862731 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" (UID: "ac73110f-ec10-4b3b-9a7c-02a43ce9cef4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.863036 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f46ba794-6fed-49d3-a0ce-1be8e5a623d4" (UID: "f46ba794-6fed-49d3-a0ce-1be8e5a623d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.864221 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl" (OuterVolumeSpecName: "kube-api-access-9jncl") pod "ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" (UID: "ac73110f-ec10-4b3b-9a7c-02a43ce9cef4"). 
InnerVolumeSpecName "kube-api-access-9jncl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.866718 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb" (OuterVolumeSpecName: "kube-api-access-5ndcb") pod "6c197353-35fa-478c-816b-c85320d3af70" (UID: "6c197353-35fa-478c-816b-c85320d3af70"). InnerVolumeSpecName "kube-api-access-5ndcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.867206 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt" (OuterVolumeSpecName: "kube-api-access-zvgqt") pod "f46ba794-6fed-49d3-a0ce-1be8e5a623d4" (UID: "f46ba794-6fed-49d3-a0ce-1be8e5a623d4"). InnerVolumeSpecName "kube-api-access-zvgqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955220 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ndcb\" (UniqueName: \"kubernetes.io/projected/6c197353-35fa-478c-816b-c85320d3af70-kube-api-access-5ndcb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955424 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955436 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c197353-35fa-478c-816b-c85320d3af70-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955444 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955453 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jncl\" (UniqueName: \"kubernetes.io/projected/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4-kube-api-access-9jncl\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.955463 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvgqt\" (UniqueName: \"kubernetes.io/projected/f46ba794-6fed-49d3-a0ce-1be8e5a623d4-kube-api-access-zvgqt\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.968734 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e816-account-create-update-rw6jb" event={"ID":"f46ba794-6fed-49d3-a0ce-1be8e5a623d4","Type":"ContainerDied","Data":"f940666617a86f91036338a0a09e9bff77dc039a5188c4b40d60da7e0e9a4c78"} Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.968771 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f940666617a86f91036338a0a09e9bff77dc039a5188c4b40d60da7e0e9a4c78" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.968902 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-rw6jb" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.970819 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-171f-account-create-update-qzwkr" event={"ID":"6c197353-35fa-478c-816b-c85320d3af70","Type":"ContainerDied","Data":"2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f"} Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.970871 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a92fa5338758904fb013fdfec0f990c220cb0a66d128977ea19e647680e619f" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.970921 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-171f-account-create-update-qzwkr" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.979637 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6bw48" event={"ID":"ac73110f-ec10-4b3b-9a7c-02a43ce9cef4","Type":"ContainerDied","Data":"ac4192b0b41ffb2b11aa5b2ff31053e3f56e260bc1d740fe98ccc5a26aa1b258"} Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.979684 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac4192b0b41ffb2b11aa5b2ff31053e3f56e260bc1d740fe98ccc5a26aa1b258" Mar 20 13:44:52 crc kubenswrapper[4856]: I0320 13:44:52.979747 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6bw48" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.061592 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157547 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157598 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157669 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157690 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157807 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdxsc\" (UniqueName: \"kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.157841 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config\") pod \"aced2eae-1557-490d-8808-ac1ee0761fb9\" (UID: \"aced2eae-1557-490d-8808-ac1ee0761fb9\") " Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.163069 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc" (OuterVolumeSpecName: "kube-api-access-pdxsc") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "kube-api-access-pdxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: E0320 13:44:53.192898 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c197353_35fa_478c_816b_c85320d3af70.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46ba794_6fed_49d3_a0ce_1be8e5a623d4.slice/crio-f940666617a86f91036338a0a09e9bff77dc039a5188c4b40d60da7e0e9a4c78\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf46ba794_6fed_49d3_a0ce_1be8e5a623d4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac73110f_ec10_4b3b_9a7c_02a43ce9cef4.slice/crio-ac4192b0b41ffb2b11aa5b2ff31053e3f56e260bc1d740fe98ccc5a26aa1b258\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac73110f_ec10_4b3b_9a7c_02a43ce9cef4.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.200103 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.201179 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config" (OuterVolumeSpecName: "config") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.203822 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.205718 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.211264 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aced2eae-1557-490d-8808-ac1ee0761fb9" (UID: "aced2eae-1557-490d-8808-ac1ee0761fb9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259223 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259265 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259297 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259308 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259319 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aced2eae-1557-490d-8808-ac1ee0761fb9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.259328 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdxsc\" (UniqueName: \"kubernetes.io/projected/aced2eae-1557-490d-8808-ac1ee0761fb9-kube-api-access-pdxsc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.996816 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.996826 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-tj65h" event={"ID":"aced2eae-1557-490d-8808-ac1ee0761fb9","Type":"ContainerDied","Data":"4cb699b45f60af94d3c598281432089796b845fab914328dee9cc47dfed1412a"} Mar 20 13:44:53 crc kubenswrapper[4856]: I0320 13:44:53.997152 4856 scope.go:117] "RemoveContainer" containerID="dc1f04b0444878cf9bfead0cb47bc72cfedb5a6f2872fbcc2d61146577b113cc" Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.001308 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" event={"ID":"36374778-f2e8-453d-81bc-b76216ab56b3","Type":"ContainerStarted","Data":"2dededf8886bfe05a51f84f7cf8f5ea61754b376513242d1d77f5adce3e9fe2a"} Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.001616 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.004571 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ck59h" event={"ID":"00641aa4-4bea-4510-b7a7-a5e1c022340a","Type":"ContainerStarted","Data":"4fe885cdd4aa1c3051a53c27c91141ee996859bc09dc6cddf5ba8e626d811008"} Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.022054 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.026476 4856 scope.go:117] "RemoveContainer" containerID="e4a69faaa223cfec399d7968d877f03a45b5b42cf94e4121e04f0b2dc26b0dfb" Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.028445 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-tj65h"] Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.045827 4856 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" podStartSLOduration=6.045812604 podStartE2EDuration="6.045812604s" podCreationTimestamp="2026-03-20 13:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:54.041718191 +0000 UTC m=+1308.922744331" watchObservedRunningTime="2026-03-20 13:44:54.045812604 +0000 UTC m=+1308.926838734" Mar 20 13:44:54 crc kubenswrapper[4856]: I0320 13:44:54.070418 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ck59h" podStartSLOduration=2.9011976969999997 podStartE2EDuration="8.070396601s" podCreationTimestamp="2026-03-20 13:44:46 +0000 UTC" firstStartedPulling="2026-03-20 13:44:47.649386674 +0000 UTC m=+1302.530412804" lastFinishedPulling="2026-03-20 13:44:52.818585578 +0000 UTC m=+1307.699611708" observedRunningTime="2026-03-20 13:44:54.057769184 +0000 UTC m=+1308.938795324" watchObservedRunningTime="2026-03-20 13:44:54.070396601 +0000 UTC m=+1308.951422731" Mar 20 13:44:55 crc kubenswrapper[4856]: I0320 13:44:55.835936 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" path="/var/lib/kubelet/pods/aced2eae-1557-490d-8808-ac1ee0761fb9/volumes" Mar 20 13:44:56 crc kubenswrapper[4856]: I0320 13:44:56.025432 4856 generic.go:334] "Generic (PLEG): container finished" podID="00641aa4-4bea-4510-b7a7-a5e1c022340a" containerID="4fe885cdd4aa1c3051a53c27c91141ee996859bc09dc6cddf5ba8e626d811008" exitCode=0 Mar 20 13:44:56 crc kubenswrapper[4856]: I0320 13:44:56.025476 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ck59h" event={"ID":"00641aa4-4bea-4510-b7a7-a5e1c022340a","Type":"ContainerDied","Data":"4fe885cdd4aa1c3051a53c27c91141ee996859bc09dc6cddf5ba8e626d811008"} Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.344600 4856 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.434183 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle\") pod \"00641aa4-4bea-4510-b7a7-a5e1c022340a\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.434350 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data\") pod \"00641aa4-4bea-4510-b7a7-a5e1c022340a\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.434448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8cvq\" (UniqueName: \"kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq\") pod \"00641aa4-4bea-4510-b7a7-a5e1c022340a\" (UID: \"00641aa4-4bea-4510-b7a7-a5e1c022340a\") " Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.457127 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq" (OuterVolumeSpecName: "kube-api-access-k8cvq") pod "00641aa4-4bea-4510-b7a7-a5e1c022340a" (UID: "00641aa4-4bea-4510-b7a7-a5e1c022340a"). InnerVolumeSpecName "kube-api-access-k8cvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.463661 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00641aa4-4bea-4510-b7a7-a5e1c022340a" (UID: "00641aa4-4bea-4510-b7a7-a5e1c022340a"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.496999 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data" (OuterVolumeSpecName: "config-data") pod "00641aa4-4bea-4510-b7a7-a5e1c022340a" (UID: "00641aa4-4bea-4510-b7a7-a5e1c022340a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.536448 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.536484 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00641aa4-4bea-4510-b7a7-a5e1c022340a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:57 crc kubenswrapper[4856]: I0320 13:44:57.536497 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8cvq\" (UniqueName: \"kubernetes.io/projected/00641aa4-4bea-4510-b7a7-a5e1c022340a-kube-api-access-k8cvq\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.044321 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ck59h" event={"ID":"00641aa4-4bea-4510-b7a7-a5e1c022340a","Type":"ContainerDied","Data":"c30a83b9aa01d9ca092b49fed8bc053369c9dcfda47cc6fec3f13a6d5c6925d1"} Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.044580 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c30a83b9aa01d9ca092b49fed8bc053369c9dcfda47cc6fec3f13a6d5c6925d1" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.044388 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ck59h" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.515477 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.591120 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.591477 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-nvzdt" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="dnsmasq-dns" containerID="cri-o://8b9eb612881133f8106b807c3fc767f1a342062e5330fcf41145845996fb5dd0" gracePeriod=10 Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.626956 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"] Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.628899 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f749db7-7069-4107-90ea-edfc4ea7dc7f" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.628925 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f749db7-7069-4107-90ea-edfc4ea7dc7f" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.628937 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.628946 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.628963 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46ba794-6fed-49d3-a0ce-1be8e5a623d4" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc 
kubenswrapper[4856]: I0320 13:44:58.628971 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46ba794-6fed-49d3-a0ce-1be8e5a623d4" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.628984 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e928ca03-85a4-4e75-bfc2-6752d35d34ab" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.628992 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e928ca03-85a4-4e75-bfc2-6752d35d34ab" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.629016 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c197353-35fa-478c-816b-c85320d3af70" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629024 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c197353-35fa-478c-816b-c85320d3af70" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.629051 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerName="init" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629058 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerName="init" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.629071 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119ab08d-caf7-497f-b11d-fb0d06a34600" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629079 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="119ab08d-caf7-497f-b11d-fb0d06a34600" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.629107 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" 
containerName="dnsmasq-dns" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629115 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerName="dnsmasq-dns" Mar 20 13:44:58 crc kubenswrapper[4856]: E0320 13:44:58.629133 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00641aa4-4bea-4510-b7a7-a5e1c022340a" containerName="keystone-db-sync" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629140 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="00641aa4-4bea-4510-b7a7-a5e1c022340a" containerName="keystone-db-sync" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629367 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="aced2eae-1557-490d-8808-ac1ee0761fb9" containerName="dnsmasq-dns" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629386 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46ba794-6fed-49d3-a0ce-1be8e5a623d4" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629395 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e928ca03-85a4-4e75-bfc2-6752d35d34ab" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629403 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f749db7-7069-4107-90ea-edfc4ea7dc7f" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629415 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" containerName="mariadb-database-create" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629425 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="00641aa4-4bea-4510-b7a7-a5e1c022340a" containerName="keystone-db-sync" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629443 4856 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6c197353-35fa-478c-816b-c85320d3af70" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.629458 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="119ab08d-caf7-497f-b11d-fb0d06a34600" containerName="mariadb-account-create-update" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.630680 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653373 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5h7c\" (UniqueName: \"kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653454 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653492 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653536 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc\") pod 
\"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653595 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.653629 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.654395 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.669820 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-lxqp7"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.670924 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.676336 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-svnn6" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.678980 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.682247 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.682491 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.682623 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.704799 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lxqp7"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757117 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757420 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757446 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757469 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757488 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757507 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757537 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g2b\" (UniqueName: \"kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757564 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757588 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757607 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757651 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.757673 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5h7c\" (UniqueName: \"kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.758709 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.759210 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.759704 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.764059 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.767678 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.815251 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5h7c\" (UniqueName: \"kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c\") pod 
\"dnsmasq-dns-847c4cc679-z2rz7\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.858809 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g2b\" (UniqueName: \"kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.858873 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.858928 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.858961 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.859018 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " 
pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.859040 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.869516 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.872221 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.874118 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.899913 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.908439 4856 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.912798 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g2b\" (UniqueName: \"kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b\") pod \"keystone-bootstrap-lxqp7\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.940409 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.942301 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.949278 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.949579 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.956490 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sllh9"] Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.957473 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.960720 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.960913 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sgh42" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.961308 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962165 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmgz\" (UniqueName: \"kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962198 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962252 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962299 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd\") pod \"ceilometer-0\" 
(UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962328 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962359 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.962374 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:58 crc kubenswrapper[4856]: I0320 13:44:58.970448 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.023539 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sllh9"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.024830 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.035169 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fzpqw"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.036805 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.041011 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.042833 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cj7cv" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.042977 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.050283 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068654 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068721 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8dqm\" (UniqueName: \"kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068745 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068780 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068806 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068832 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068854 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ks48\" (UniqueName: \"kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068876 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068903 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068934 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068953 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.068996 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.069022 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmgz\" (UniqueName: \"kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.069046 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " 
pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.074468 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.077760 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.078315 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.078677 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.079548 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.086730 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc 
kubenswrapper[4856]: I0320 13:44:59.089258 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.089457 4856 generic.go:334] "Generic (PLEG): container finished" podID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerID="8b9eb612881133f8106b807c3fc767f1a342062e5330fcf41145845996fb5dd0" exitCode=0 Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.089570 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-nvzdt" event={"ID":"b55fcdcb-f03a-448e-9fc1-a8d04504b935","Type":"ContainerDied","Data":"8b9eb612881133f8106b807c3fc767f1a342062e5330fcf41145845996fb5dd0"} Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.116135 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.116999 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fzpqw"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.119233 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmgz\" (UniqueName: \"kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz\") pod \"ceilometer-0\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.142185 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-52xvf"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.157412 4856 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.157540 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-52xvf" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.165022 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.165442 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vfhl9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.168740 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-n4czn"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.169320 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.174882 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176479 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8dqm\" (UniqueName: \"kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176530 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176559 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176583 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ks48\" (UniqueName: \"kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176602 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176652 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176685 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.176705 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.177035 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.183526 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.186625 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-52xvf"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.192813 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.193450 4856 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tndcb" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.194868 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.197370 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.203962 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.205317 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.207106 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.208099 4856 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s8dqm\" (UniqueName: \"kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm\") pod \"neutron-db-sync-sllh9\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " pod="openstack/neutron-db-sync-sllh9" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.212475 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n4czn"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.231794 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"] Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.236412 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ks48\" (UniqueName: \"kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48\") pod \"placement-db-sync-fzpqw\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " pod="openstack/placement-db-sync-fzpqw" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.284423 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml4l6\" (UniqueName: \"kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.284587 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.284693 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.284807 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrqt\" (UniqueName: \"kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.284959 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285086 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpbtj\" (UniqueName: \"kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285211 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285375 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285471 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285575 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285677 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285764 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285872 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.285961 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.286050 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.326713 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.343933 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-nvzdt" Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.376746 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sllh9"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387028 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc\") pod \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") "
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387162 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9chgq\" (UniqueName: \"kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq\") pod \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") "
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387295 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb\") pod \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") "
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387330 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config\") pod \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") "
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387357 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb\") pod \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\" (UID: \"b55fcdcb-f03a-448e-9fc1-a8d04504b935\") "
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387630 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387668 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387692 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387723 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387755 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387784 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387819 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387847 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387874 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387941 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml4l6\" (UniqueName: \"kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387971 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.387998 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.388042 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrqt\" (UniqueName: \"kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.388072 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.388126 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpbtj\" (UniqueName: \"kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.391593 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.391665 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.398589 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.400806 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.401525 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.402434 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.403044 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.403366 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.403993 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.404404 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.405582 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.408424 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.414555 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpbtj\" (UniqueName: \"kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj\") pod \"cinder-db-sync-n4czn\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.426267 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq" (OuterVolumeSpecName: "kube-api-access-9chgq") pod "b55fcdcb-f03a-448e-9fc1-a8d04504b935" (UID: "b55fcdcb-f03a-448e-9fc1-a8d04504b935"). InnerVolumeSpecName "kube-api-access-9chgq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.429739 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml4l6\" (UniqueName: \"kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6\") pod \"barbican-db-sync-52xvf\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.434923 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrqt\" (UniqueName: \"kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt\") pod \"dnsmasq-dns-785d8bcb8c-2bz5n\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.463012 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fzpqw"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.490575 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9chgq\" (UniqueName: \"kubernetes.io/projected/b55fcdcb-f03a-448e-9fc1-a8d04504b935-kube-api-access-9chgq\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.493175 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-52xvf"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.510091 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config" (OuterVolumeSpecName: "config") pod "b55fcdcb-f03a-448e-9fc1-a8d04504b935" (UID: "b55fcdcb-f03a-448e-9fc1-a8d04504b935"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.519206 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b55fcdcb-f03a-448e-9fc1-a8d04504b935" (UID: "b55fcdcb-f03a-448e-9fc1-a8d04504b935"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.521152 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b55fcdcb-f03a-448e-9fc1-a8d04504b935" (UID: "b55fcdcb-f03a-448e-9fc1-a8d04504b935"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.521823 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b55fcdcb-f03a-448e-9fc1-a8d04504b935" (UID: "b55fcdcb-f03a-448e-9fc1-a8d04504b935"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.533348 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.548378 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n4czn"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.591680 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.591718 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.591732 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.591742 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b55fcdcb-f03a-448e-9fc1-a8d04504b935-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.800397 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"]
Mar 20 13:44:59 crc kubenswrapper[4856]: W0320 13:44:59.801978 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bd0eef5_69f4_4bc9_9704_613d1828e300.slice/crio-d88ef3439c5e6ebb6b9b2027454c30a7cbfedc703817af2fbb25a2c02659eb7f WatchSource:0}: Error finding container d88ef3439c5e6ebb6b9b2027454c30a7cbfedc703817af2fbb25a2c02659eb7f: Status 404 returned error can't find the container with id d88ef3439c5e6ebb6b9b2027454c30a7cbfedc703817af2fbb25a2c02659eb7f
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.808350 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-lxqp7"]
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.816110 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:44:59 crc kubenswrapper[4856]: E0320 13:44:59.816878 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="dnsmasq-dns"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.816946 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="dnsmasq-dns"
Mar 20 13:44:59 crc kubenswrapper[4856]: E0320 13:44:59.817005 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="init"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.817068 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="init"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.817328 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" containerName="dnsmasq-dns"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.818369 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.821930 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.822083 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.822153 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-6xjtb"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.822537 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.859462 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.903074 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904339 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904623 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904706 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904782 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904886 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh2xj\" (UniqueName: \"kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.904979 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.905080 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.927201 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.928842 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.933508 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.933975 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.966422 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:44:59 crc kubenswrapper[4856]: W0320 13:44:59.969204 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcda26562_2768_4d20_8aec_342cb3bf6b8c.slice/crio-79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78 WatchSource:0}: Error finding container 79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78: Status 404 returned error can't find the container with id 79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.975430 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:44:59 crc kubenswrapper[4856]: I0320 13:44:59.988643 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sllh9"]
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.010974 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.011052 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012129 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh2xj\" (UniqueName: \"kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012181 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012210 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012271 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012328 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012364 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012438 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012509 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012586 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012672 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012696 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012710 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nx5j\" (UniqueName: \"kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012805 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.012827 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.014745 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.014882 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.017068 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.027448 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.028884 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.030494 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.034207 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.043677 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh2xj\" (UniqueName: \"kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.091977 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.103012 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-nvzdt" event={"ID":"b55fcdcb-f03a-448e-9fc1-a8d04504b935","Type":"ContainerDied","Data":"de5a575a0778df092f923395ccba36e60bcfd6422a8f6d823a033760878d1a56"}
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.103088 4856 scope.go:117] "RemoveContainer" containerID="8b9eb612881133f8106b807c3fc767f1a342062e5330fcf41145845996fb5dd0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.104673 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-nvzdt"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.107892 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lxqp7" event={"ID":"ed07030a-fbbf-4b61-9aa0-910b9c4ae087","Type":"ContainerStarted","Data":"8e17700b3b3ec3b2d20d515c791ed4488721299008f7ab6abbe2db0778f5ce0d"}
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.118867 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.118969 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.118991 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.119064 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.119116 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.119135 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nx5j\" (UniqueName: \"kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.119210 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.120320 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.120563 4856 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.121298 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerStarted","Data":"79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78"} Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.125511 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.132733 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.134123 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" event={"ID":"7bd0eef5-69f4-4bc9-9704-613d1828e300","Type":"ContainerStarted","Data":"d88ef3439c5e6ebb6b9b2027454c30a7cbfedc703817af2fbb25a2c02659eb7f"} Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.138028 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") 
" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.138653 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.142758 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.145537 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.146433 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sllh9" event={"ID":"4e59b689-e9d4-460b-8a82-50770f4d4422","Type":"ContainerStarted","Data":"0d1247cab4472b76ce7c1dc7cdce339226fad1fc69f7b84f2a7e9e7fd4737c60"} Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.158528 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.161780 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nx5j\" (UniqueName: \"kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j\") pod \"glance-default-internal-api-0\" (UID: 
\"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.167827 4856 scope.go:117] "RemoveContainer" containerID="371ed1a7c848633219870522eddd5617212429dd771030b4f04bbf70e1fd6de0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.206913 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.208421 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.210893 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.211172 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.211232 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.227203 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-nvzdt"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.236994 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.247462 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fzpqw"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 
13:45:00.301202 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-n4czn"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.301816 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: W0320 13:45:00.307869 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a11a777_2932_4a56_898d_2de11472cbc9.slice/crio-b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5 WatchSource:0}: Error finding container b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5: Status 404 returned error can't find the container with id b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5 Mar 20 13:45:00 crc kubenswrapper[4856]: W0320 13:45:00.314090 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a15bbae_4b61_484e_a95f_e5de1b17650b.slice/crio-67d9c9c10dcbbb8c5b2905f5fad89995dd7a9cdc31aa39dcdac06e229b24acaa WatchSource:0}: Error finding container 67d9c9c10dcbbb8c5b2905f5fad89995dd7a9cdc31aa39dcdac06e229b24acaa: Status 404 returned error can't find the container with id 67d9c9c10dcbbb8c5b2905f5fad89995dd7a9cdc31aa39dcdac06e229b24acaa Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.317984 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.318540 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"] Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.328243 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-52xvf"] Mar 20 13:45:00 crc kubenswrapper[4856]: W0320 13:45:00.334777 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode39393cf_dda0_4755_8e66_fc571afa2a1a.slice/crio-ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0 WatchSource:0}: Error finding container ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0: Status 404 returned error can't find the container with id ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0 Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.347746 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.347816 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmrj5\" (UniqueName: \"kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.347867 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.449749 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmrj5\" (UniqueName: \"kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.449819 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.449923 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.450754 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: 
I0320 13:45:00.454590 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.476812 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmrj5\" (UniqueName: \"kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5\") pod \"collect-profiles-29566905-8hzb8\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.527827 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:00 crc kubenswrapper[4856]: I0320 13:45:00.925631 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.077296 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.117918 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.225525 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerStarted","Data":"e07b9ef1341a51e41454f94912d78f3b1953a31bf9d6bd1566499cb6c7b5524a"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.252183 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" event={"ID":"0a15bbae-4b61-484e-a95f-e5de1b17650b","Type":"ContainerStarted","Data":"67d9c9c10dcbbb8c5b2905f5fad89995dd7a9cdc31aa39dcdac06e229b24acaa"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.280978 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n4czn" event={"ID":"0a11a777-2932-4a56-898d-2de11472cbc9","Type":"ContainerStarted","Data":"b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.298779 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" event={"ID":"2ab70ac0-2902-4f84-9142-060f5adee35b","Type":"ContainerStarted","Data":"0b923184d3eff50c712c2311572dad98f915404e08a7f1421d310719da2f4e0b"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.301792 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lxqp7" event={"ID":"ed07030a-fbbf-4b61-9aa0-910b9c4ae087","Type":"ContainerStarted","Data":"02776c7a86559edcb0633594c636cd3cd3d9717439545e3e679b74290bdecf55"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.302943 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-52xvf" event={"ID":"e39393cf-dda0-4755-8e66-fc571afa2a1a","Type":"ContainerStarted","Data":"ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.308457 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fzpqw" event={"ID":"394ec9f9-f47c-4f12-af34-26a3953f7668","Type":"ContainerStarted","Data":"4ced0aafb50dd0080bfad15ce272a7e717951538844fa499140640cb53be21dd"} Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.378978 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 
13:45:01.495838 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.508216 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:01 crc kubenswrapper[4856]: I0320 13:45:01.830414 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55fcdcb-f03a-448e-9fc1-a8d04504b935" path="/var/lib/kubelet/pods/b55fcdcb-f03a-448e-9fc1-a8d04504b935/volumes" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.320950 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sllh9" event={"ID":"4e59b689-e9d4-460b-8a82-50770f4d4422","Type":"ContainerStarted","Data":"a366776a203a7fa9d5f08eb4671d855fe2eeb0585c1540357869f4722b8099e0"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.324790 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerStarted","Data":"f971831406407c61bf8f138c977f1dfe4dc36f8092931a2e112ee7d9776debed"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.326828 4856 generic.go:334] "Generic (PLEG): container finished" podID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerID="c99856c3c75f033c60969b18882bfc9a84ca71e63f81c5ca1ae48fe708661ec7" exitCode=0 Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.326893 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" event={"ID":"0a15bbae-4b61-484e-a95f-e5de1b17650b","Type":"ContainerDied","Data":"c99856c3c75f033c60969b18882bfc9a84ca71e63f81c5ca1ae48fe708661ec7"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.329128 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" 
event={"ID":"2ab70ac0-2902-4f84-9142-060f5adee35b","Type":"ContainerStarted","Data":"4b97f3e939ad017e8453b293bc48ae202eb064df16e3bf1b2a34e44913f6d7c5"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.335618 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerStarted","Data":"e2fcb51573f07e7a30bcb5c1d9de8ea066a66b00ca7f2e7f2ea352ab8fc44e21"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.367317 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sllh9" podStartSLOduration=4.367291488 podStartE2EDuration="4.367291488s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:02.339719279 +0000 UTC m=+1317.220745439" watchObservedRunningTime="2026-03-20 13:45:02.367291488 +0000 UTC m=+1317.248317638" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.369184 4856 generic.go:334] "Generic (PLEG): container finished" podID="7bd0eef5-69f4-4bc9-9704-613d1828e300" containerID="e676d9b2aae50e93f18d5d4757435b6cbdb85c182665183b13b9e39da3861c5f" exitCode=0 Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.369538 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" event={"ID":"7bd0eef5-69f4-4bc9-9704-613d1828e300","Type":"ContainerDied","Data":"e676d9b2aae50e93f18d5d4757435b6cbdb85c182665183b13b9e39da3861c5f"} Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.394550 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" podStartSLOduration=2.394528658 podStartE2EDuration="2.394528658s" podCreationTimestamp="2026-03-20 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:02.387973547 +0000 UTC m=+1317.268999697" watchObservedRunningTime="2026-03-20 13:45:02.394528658 +0000 UTC m=+1317.275554788" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.476973 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-lxqp7" podStartSLOduration=4.476951318 podStartE2EDuration="4.476951318s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:02.463580759 +0000 UTC m=+1317.344606899" watchObservedRunningTime="2026-03-20 13:45:02.476951318 +0000 UTC m=+1317.357977448" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.826645 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.912906 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.913036 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.913078 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: 
\"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.913129 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5h7c\" (UniqueName: \"kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.913201 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.913218 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb\") pod \"7bd0eef5-69f4-4bc9-9704-613d1828e300\" (UID: \"7bd0eef5-69f4-4bc9-9704-613d1828e300\") " Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.919781 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c" (OuterVolumeSpecName: "kube-api-access-x5h7c") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "kube-api-access-x5h7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.952540 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.954412 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.965527 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4856]: I0320 13:45:02.973028 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config" (OuterVolumeSpecName: "config") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.017159 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.017193 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5h7c\" (UniqueName: \"kubernetes.io/projected/7bd0eef5-69f4-4bc9-9704-613d1828e300-kube-api-access-x5h7c\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.017205 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.017213 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.017221 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.057394 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7bd0eef5-69f4-4bc9-9704-613d1828e300" (UID: "7bd0eef5-69f4-4bc9-9704-613d1828e300"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.119149 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7bd0eef5-69f4-4bc9-9704-613d1828e300-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.378663 4856 generic.go:334] "Generic (PLEG): container finished" podID="2ab70ac0-2902-4f84-9142-060f5adee35b" containerID="4b97f3e939ad017e8453b293bc48ae202eb064df16e3bf1b2a34e44913f6d7c5" exitCode=0 Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.378724 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" event={"ID":"2ab70ac0-2902-4f84-9142-060f5adee35b","Type":"ContainerDied","Data":"4b97f3e939ad017e8453b293bc48ae202eb064df16e3bf1b2a34e44913f6d7c5"} Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.381945 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerStarted","Data":"12732f62cfa802749fb18fdb456d217c14720f555c765b16e7f7ba386c0a9021"} Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.384047 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" event={"ID":"7bd0eef5-69f4-4bc9-9704-613d1828e300","Type":"ContainerDied","Data":"d88ef3439c5e6ebb6b9b2027454c30a7cbfedc703817af2fbb25a2c02659eb7f"} Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.384078 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-z2rz7" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.384110 4856 scope.go:117] "RemoveContainer" containerID="e676d9b2aae50e93f18d5d4757435b6cbdb85c182665183b13b9e39da3861c5f" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.392136 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerStarted","Data":"5c19b863d51f607405948f1f335febc4615371a4b9139ac745bcb56b6c457724"} Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.392456 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-log" containerID="cri-o://f971831406407c61bf8f138c977f1dfe4dc36f8092931a2e112ee7d9776debed" gracePeriod=30 Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.392619 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-httpd" containerID="cri-o://5c19b863d51f607405948f1f335febc4615371a4b9139ac745bcb56b6c457724" gracePeriod=30 Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.402238 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" event={"ID":"0a15bbae-4b61-484e-a95f-e5de1b17650b","Type":"ContainerStarted","Data":"8fd0f2fd98f2558fe316a2a01472241f92264ffd1755b490daf56737d3b1d1e5"} Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.435664 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.43564447 podStartE2EDuration="5.43564447s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 13:45:03.43057728 +0000 UTC m=+1318.311603410" watchObservedRunningTime="2026-03-20 13:45:03.43564447 +0000 UTC m=+1318.316670600" Mar 20 13:45:03 crc kubenswrapper[4856]: E0320 13:45:03.455131 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bd0eef5_69f4_4bc9_9704_613d1828e300.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.476212 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" podStartSLOduration=4.476184946 podStartE2EDuration="4.476184946s" podCreationTimestamp="2026-03-20 13:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:03.463117165 +0000 UTC m=+1318.344143295" watchObservedRunningTime="2026-03-20 13:45:03.476184946 +0000 UTC m=+1318.357211076" Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.517672 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"] Mar 20 13:45:03 crc kubenswrapper[4856]: I0320 13:45:03.524040 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-z2rz7"] Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:03.831847 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bd0eef5-69f4-4bc9-9704-613d1828e300" path="/var/lib/kubelet/pods/7bd0eef5-69f4-4bc9-9704-613d1828e300/volumes" Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.438684 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerStarted","Data":"e1ee12f7386549ed808ecbcffb41d31e25a502fd1ce37e8489b3a58bcaf0382f"} Mar 20 13:45:04 crc 
kubenswrapper[4856]: I0320 13:45:04.438815 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-log" containerID="cri-o://12732f62cfa802749fb18fdb456d217c14720f555c765b16e7f7ba386c0a9021" gracePeriod=30 Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.439222 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-httpd" containerID="cri-o://e1ee12f7386549ed808ecbcffb41d31e25a502fd1ce37e8489b3a58bcaf0382f" gracePeriod=30 Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.459022 4856 generic.go:334] "Generic (PLEG): container finished" podID="23233316-ac61-4a01-ab66-de81d656e6d0" containerID="5c19b863d51f607405948f1f335febc4615371a4b9139ac745bcb56b6c457724" exitCode=143 Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.459077 4856 generic.go:334] "Generic (PLEG): container finished" podID="23233316-ac61-4a01-ab66-de81d656e6d0" containerID="f971831406407c61bf8f138c977f1dfe4dc36f8092931a2e112ee7d9776debed" exitCode=143 Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.459126 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerDied","Data":"5c19b863d51f607405948f1f335febc4615371a4b9139ac745bcb56b6c457724"} Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.459160 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerDied","Data":"f971831406407c61bf8f138c977f1dfe4dc36f8092931a2e112ee7d9776debed"} Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.459403 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:45:04 crc kubenswrapper[4856]: I0320 13:45:04.477701 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.477681356 podStartE2EDuration="6.477681356s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:04.45929462 +0000 UTC m=+1319.340320750" watchObservedRunningTime="2026-03-20 13:45:04.477681356 +0000 UTC m=+1319.358707486" Mar 20 13:45:05 crc kubenswrapper[4856]: I0320 13:45:05.471552 4856 generic.go:334] "Generic (PLEG): container finished" podID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerID="e1ee12f7386549ed808ecbcffb41d31e25a502fd1ce37e8489b3a58bcaf0382f" exitCode=0 Mar 20 13:45:05 crc kubenswrapper[4856]: I0320 13:45:05.472138 4856 generic.go:334] "Generic (PLEG): container finished" podID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerID="12732f62cfa802749fb18fdb456d217c14720f555c765b16e7f7ba386c0a9021" exitCode=143 Mar 20 13:45:05 crc kubenswrapper[4856]: I0320 13:45:05.471629 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerDied","Data":"e1ee12f7386549ed808ecbcffb41d31e25a502fd1ce37e8489b3a58bcaf0382f"} Mar 20 13:45:05 crc kubenswrapper[4856]: I0320 13:45:05.472301 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerDied","Data":"12732f62cfa802749fb18fdb456d217c14720f555c765b16e7f7ba386c0a9021"} Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.443957 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.450896 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.470882 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmrj5\" (UniqueName: \"kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5\") pod \"2ab70ac0-2902-4f84-9142-060f5adee35b\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.470971 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume\") pod \"2ab70ac0-2902-4f84-9142-060f5adee35b\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471022 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume\") pod \"2ab70ac0-2902-4f84-9142-060f5adee35b\" (UID: \"2ab70ac0-2902-4f84-9142-060f5adee35b\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471075 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471170 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rh2xj\" (UniqueName: \"kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj\") pod 
\"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471291 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471545 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471583 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471612 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471673 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.471716 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"23233316-ac61-4a01-ab66-de81d656e6d0\" (UID: \"23233316-ac61-4a01-ab66-de81d656e6d0\") " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.478777 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.479173 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume" (OuterVolumeSpecName: "config-volume") pod "2ab70ac0-2902-4f84-9142-060f5adee35b" (UID: "2ab70ac0-2902-4f84-9142-060f5adee35b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.479556 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.482650 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2ab70ac0-2902-4f84-9142-060f5adee35b" (UID: "2ab70ac0-2902-4f84-9142-060f5adee35b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.484757 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs" (OuterVolumeSpecName: "logs") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.489731 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj" (OuterVolumeSpecName: "kube-api-access-rh2xj") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "kube-api-access-rh2xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.491971 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5" (OuterVolumeSpecName: "kube-api-access-jmrj5") pod "2ab70ac0-2902-4f84-9142-060f5adee35b" (UID: "2ab70ac0-2902-4f84-9142-060f5adee35b"). InnerVolumeSpecName "kube-api-access-jmrj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.502595 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts" (OuterVolumeSpecName: "scripts") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.535111 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.549580 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23233316-ac61-4a01-ab66-de81d656e6d0","Type":"ContainerDied","Data":"e07b9ef1341a51e41454f94912d78f3b1953a31bf9d6bd1566499cb6c7b5524a"} Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.549637 4856 scope.go:117] "RemoveContainer" containerID="5c19b863d51f607405948f1f335febc4615371a4b9139ac745bcb56b6c457724" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.549757 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.563422 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" event={"ID":"2ab70ac0-2902-4f84-9142-060f5adee35b","Type":"ContainerDied","Data":"0b923184d3eff50c712c2311572dad98f915404e08a7f1421d310719da2f4e0b"} Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.563465 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b923184d3eff50c712c2311572dad98f915404e08a7f1421d310719da2f4e0b" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.563516 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.594617 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data" (OuterVolumeSpecName: "config-data") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595469 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595497 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595507 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmrj5\" (UniqueName: \"kubernetes.io/projected/2ab70ac0-2902-4f84-9142-060f5adee35b-kube-api-access-jmrj5\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595517 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ab70ac0-2902-4f84-9142-060f5adee35b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595525 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2ab70ac0-2902-4f84-9142-060f5adee35b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595533 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rh2xj\" (UniqueName: 
\"kubernetes.io/projected/23233316-ac61-4a01-ab66-de81d656e6d0-kube-api-access-rh2xj\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595542 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23233316-ac61-4a01-ab66-de81d656e6d0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595550 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595557 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595574 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.595765 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" containerID="cri-o://2dededf8886bfe05a51f84f7cf8f5ea61754b376513242d1d77f5adce3e9fe2a" gracePeriod=10 Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.599906 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.605561 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "23233316-ac61-4a01-ab66-de81d656e6d0" (UID: "23233316-ac61-4a01-ab66-de81d656e6d0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.651156 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.697005 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.697219 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.697232 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23233316-ac61-4a01-ab66-de81d656e6d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.905540 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.932321 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.950332 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:09 crc kubenswrapper[4856]: E0320 13:45:09.950800 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-log" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.950822 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-log" Mar 20 13:45:09 crc kubenswrapper[4856]: E0320 13:45:09.950840 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bd0eef5-69f4-4bc9-9704-613d1828e300" containerName="init" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.950848 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bd0eef5-69f4-4bc9-9704-613d1828e300" containerName="init" Mar 20 13:45:09 crc kubenswrapper[4856]: E0320 13:45:09.950871 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ab70ac0-2902-4f84-9142-060f5adee35b" containerName="collect-profiles" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.950878 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ab70ac0-2902-4f84-9142-060f5adee35b" containerName="collect-profiles" Mar 20 13:45:09 crc kubenswrapper[4856]: E0320 13:45:09.950895 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-httpd" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.950902 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-httpd" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.951124 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ab70ac0-2902-4f84-9142-060f5adee35b" containerName="collect-profiles" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.951142 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" 
containerName="glance-log" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.951158 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" containerName="glance-httpd" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.951171 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bd0eef5-69f4-4bc9-9704-613d1828e300" containerName="init" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.952191 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.956055 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.975661 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.977753 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.987207 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:45:09 crc kubenswrapper[4856]: I0320 13:45:09.987308 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106348 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106415 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106459 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106523 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106567 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106612 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjwzj\" (UniqueName: \"kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106661 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.106770 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208101 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208161 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208209 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjwzj\" (UniqueName: \"kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208310 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208351 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208388 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208413 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.208442 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.210033 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.210633 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.211241 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.218305 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.220862 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.221427 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.226248 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.229653 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjwzj\" (UniqueName: \"kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.243387 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.404114 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.576676 4856 generic.go:334] "Generic (PLEG): container finished" podID="36374778-f2e8-453d-81bc-b76216ab56b3" containerID="2dededf8886bfe05a51f84f7cf8f5ea61754b376513242d1d77f5adce3e9fe2a" exitCode=0 Mar 20 13:45:10 crc kubenswrapper[4856]: I0320 13:45:10.576726 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" event={"ID":"36374778-f2e8-453d-81bc-b76216ab56b3","Type":"ContainerDied","Data":"2dededf8886bfe05a51f84f7cf8f5ea61754b376513242d1d77f5adce3e9fe2a"} Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.588182 4856 generic.go:334] "Generic (PLEG): container finished" podID="ed07030a-fbbf-4b61-9aa0-910b9c4ae087" containerID="02776c7a86559edcb0633594c636cd3cd3d9717439545e3e679b74290bdecf55" exitCode=0 Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.588535 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lxqp7" event={"ID":"ed07030a-fbbf-4b61-9aa0-910b9c4ae087","Type":"ContainerDied","Data":"02776c7a86559edcb0633594c636cd3cd3d9717439545e3e679b74290bdecf55"} Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.768718 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.841911 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23233316-ac61-4a01-ab66-de81d656e6d0" path="/var/lib/kubelet/pods/23233316-ac61-4a01-ab66-de81d656e6d0/volumes" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.942997 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943061 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943091 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943120 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943326 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: 
\"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943574 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943633 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.943690 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nx5j\" (UniqueName: \"kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j\") pod \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\" (UID: \"78f70590-3569-45fd-80cc-5ce6ec35b0b7\") " Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.944036 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs" (OuterVolumeSpecName: "logs") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.944427 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.944619 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.944637 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f70590-3569-45fd-80cc-5ce6ec35b0b7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.950318 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.950851 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j" (OuterVolumeSpecName: "kube-api-access-7nx5j") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "kube-api-access-7nx5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.963109 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts" (OuterVolumeSpecName: "scripts") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:11 crc kubenswrapper[4856]: I0320 13:45:11.984005 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.013883 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.014056 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data" (OuterVolumeSpecName: "config-data") pod "78f70590-3569-45fd-80cc-5ce6ec35b0b7" (UID: "78f70590-3569-45fd-80cc-5ce6ec35b0b7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046474 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046507 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046517 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nx5j\" (UniqueName: \"kubernetes.io/projected/78f70590-3569-45fd-80cc-5ce6ec35b0b7-kube-api-access-7nx5j\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046527 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046558 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.046567 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f70590-3569-45fd-80cc-5ce6ec35b0b7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.065718 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.148575 4856 reconciler_common.go:293] "Volume detached for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.881134 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.881243 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"78f70590-3569-45fd-80cc-5ce6ec35b0b7","Type":"ContainerDied","Data":"e2fcb51573f07e7a30bcb5c1d9de8ea066a66b00ca7f2e7f2ea352ab8fc44e21"} Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.941529 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.952084 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.959868 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:12 crc kubenswrapper[4856]: E0320 13:45:12.960302 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-log" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.960320 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-log" Mar 20 13:45:12 crc kubenswrapper[4856]: E0320 13:45:12.960339 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-httpd" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.960347 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-httpd" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.960561 4856 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-httpd" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.960594 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" containerName="glance-log" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.961720 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.966254 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.966698 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:45:12 crc kubenswrapper[4856]: I0320 13:45:12.985736 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161262 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161852 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161879 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161929 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161950 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.161973 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.162112 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.162130 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cdbh6\" (UniqueName: \"kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.263935 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264451 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdbh6\" (UniqueName: \"kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264391 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264679 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264713 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264736 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264774 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264789 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.264807 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.266405 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.267637 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.270016 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.270451 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.271097 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.272126 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.284219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdbh6\" (UniqueName: \"kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.289927 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.586991 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:13 crc kubenswrapper[4856]: I0320 13:45:13.834837 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f70590-3569-45fd-80cc-5ce6ec35b0b7" path="/var/lib/kubelet/pods/78f70590-3569-45fd-80cc-5ce6ec35b0b7/volumes" Mar 20 13:45:17 crc kubenswrapper[4856]: I0320 13:45:17.922687 4856 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod8c34367e-1bb1-4e1d-8a11-190bca797f8e"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod8c34367e-1bb1-4e1d-8a11-190bca797f8e] : Timed out while waiting for systemd to remove kubepods-besteffort-pod8c34367e_1bb1_4e1d_8a11_190bca797f8e.slice" Mar 20 13:45:18 crc kubenswrapper[4856]: I0320 13:45:18.514506 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout" Mar 20 
13:45:23 crc kubenswrapper[4856]: I0320 13:45:23.515439 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout" Mar 20 13:45:25 crc kubenswrapper[4856]: I0320 13:45:25.762947 4856 scope.go:117] "RemoveContainer" containerID="f971831406407c61bf8f138c977f1dfe4dc36f8092931a2e112ee7d9776debed" Mar 20 13:45:26 crc kubenswrapper[4856]: E0320 13:45:26.228937 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 13:45:26 crc kubenswrapper[4856]: E0320 13:45:26.229459 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nch64bh59bh5b4h67ch554h66bh66h9bh578hf6hd8h67ch599h594h68fh5d4h6bh5b8h577h57dhddh675hddh59ch578h65fhf4h5cfh57fhdch588q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-grmgz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(cda26562-2768-4d20-8aec-342cb3bf6b8c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:26 crc kubenswrapper[4856]: I0320 13:45:26.940107 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:45:26 crc kubenswrapper[4856]: I0320 13:45:26.948694 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:45:26 crc kubenswrapper[4856]: E0320 13:45:26.997902 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 13:45:26 crc kubenswrapper[4856]: E0320 13:45:26.998023 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ml4l6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDev
ices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-52xvf_openstack(e39393cf-dda0-4755-8e66-fc571afa2a1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:26 crc kubenswrapper[4856]: E0320 13:45:26.999933 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-52xvf" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.049103 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.049647 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.049751 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.049855 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data\") pod 
\"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.049941 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5g2b\" (UniqueName: \"kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b\") pod \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050036 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys\") pod \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050128 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys\") pod \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050240 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwg27\" (UniqueName: \"kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050383 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle\") pod \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050519 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050626 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config\") pod \"36374778-f2e8-453d-81bc-b76216ab56b3\" (UID: \"36374778-f2e8-453d-81bc-b76216ab56b3\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.050770 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts\") pod \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\" (UID: \"ed07030a-fbbf-4b61-9aa0-910b9c4ae087\") " Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.055249 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts" (OuterVolumeSpecName: "scripts") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.060030 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.062513 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b" (OuterVolumeSpecName: "kube-api-access-n5g2b") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "kube-api-access-n5g2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.062627 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.076357 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27" (OuterVolumeSpecName: "kube-api-access-jwg27") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "kube-api-access-jwg27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.083637 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" event={"ID":"36374778-f2e8-453d-81bc-b76216ab56b3","Type":"ContainerDied","Data":"3ab7adf506da3ffbbb084edf5725d2ae79d46e27c9ec961d9afbc8c89add1778"} Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.083766 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.088492 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-lxqp7" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.088743 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-lxqp7" event={"ID":"ed07030a-fbbf-4b61-9aa0-910b9c4ae087","Type":"ContainerDied","Data":"8e17700b3b3ec3b2d20d515c791ed4488721299008f7ab6abbe2db0778f5ce0d"} Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.088793 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e17700b3b3ec3b2d20d515c791ed4488721299008f7ab6abbe2db0778f5ce0d" Mar 20 13:45:27 crc kubenswrapper[4856]: E0320 13:45:27.091919 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-52xvf" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.108911 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.113448 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.114102 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.114884 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data" (OuterVolumeSpecName: "config-data") pod "ed07030a-fbbf-4b61-9aa0-910b9c4ae087" (UID: "ed07030a-fbbf-4b61-9aa0-910b9c4ae087"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.117607 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.121321 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config" (OuterVolumeSpecName: "config") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.125815 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "36374778-f2e8-453d-81bc-b76216ab56b3" (UID: "36374778-f2e8-453d-81bc-b76216ab56b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153308 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153376 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153394 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153431 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153443 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5g2b\" (UniqueName: \"kubernetes.io/projected/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-kube-api-access-n5g2b\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153456 4856 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153468 4856 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153479 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwg27\" (UniqueName: \"kubernetes.io/projected/36374778-f2e8-453d-81bc-b76216ab56b3-kube-api-access-jwg27\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153519 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153528 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153536 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36374778-f2e8-453d-81bc-b76216ab56b3-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.153544 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed07030a-fbbf-4b61-9aa0-910b9c4ae087-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.418799 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.425841 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-74f6bcbc87-kvcg6"] Mar 20 13:45:27 crc kubenswrapper[4856]: I0320 13:45:27.829964 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" path="/var/lib/kubelet/pods/36374778-f2e8-453d-81bc-b76216ab56b3/volumes" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.047086 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-lxqp7"] Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.056219 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-lxqp7"] Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.148784 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-97mlx"] Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.149836 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed07030a-fbbf-4b61-9aa0-910b9c4ae087" containerName="keystone-bootstrap" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.149860 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed07030a-fbbf-4b61-9aa0-910b9c4ae087" containerName="keystone-bootstrap" Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.149910 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="init" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.149918 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="init" Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.149946 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.149954 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 
13:45:28.150950 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed07030a-fbbf-4b61-9aa0-910b9c4ae087" containerName="keystone-bootstrap" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.150995 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.152635 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.163091 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-97mlx"] Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.166022 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.166421 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.166721 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.166964 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.166968 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-svnn6" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271459 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271529 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271617 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271649 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271681 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m6gg\" (UniqueName: \"kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.271735 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372700 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8m6gg\" (UniqueName: \"kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372765 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372850 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372947 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.372976 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.379006 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.380061 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.380263 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.380515 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.381651 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " 
pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.404469 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m6gg\" (UniqueName: \"kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg\") pod \"keystone-bootstrap-97mlx\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.416876 4856 scope.go:117] "RemoveContainer" containerID="e1ee12f7386549ed808ecbcffb41d31e25a502fd1ce37e8489b3a58bcaf0382f" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.494186 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.494903 4856 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.495078 4856 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rpbtj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-n4czn_openstack(0a11a777-2932-4a56-898d-2de11472cbc9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:28 crc kubenswrapper[4856]: E0320 13:45:28.496283 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-n4czn" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.516613 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-kvcg6" podUID="36374778-f2e8-453d-81bc-b76216ab56b3" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.140:5353: i/o timeout" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.534444 4856 scope.go:117] "RemoveContainer" containerID="12732f62cfa802749fb18fdb456d217c14720f555c765b16e7f7ba386c0a9021" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.609488 4856 scope.go:117] "RemoveContainer" containerID="2dededf8886bfe05a51f84f7cf8f5ea61754b376513242d1d77f5adce3e9fe2a" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.643482 4856 scope.go:117] "RemoveContainer" containerID="9314c73d5a46200720ca35902f50b8e44d58eaf3110c01669944e1e90e9ae266" Mar 20 13:45:28 crc kubenswrapper[4856]: I0320 13:45:28.959051 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:29 crc kubenswrapper[4856]: I0320 13:45:29.044542 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-97mlx"] Mar 20 13:45:29 crc kubenswrapper[4856]: I0320 13:45:29.070876 4856 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:29 crc kubenswrapper[4856]: I0320 13:45:29.105617 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fzpqw" event={"ID":"394ec9f9-f47c-4f12-af34-26a3953f7668","Type":"ContainerStarted","Data":"f708b70995f625b5570b66b4c5d623f049c5b186f6b7a641284f83c29309e266"} Mar 20 13:45:29 crc kubenswrapper[4856]: E0320 13:45:29.112898 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-n4czn" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" Mar 20 13:45:29 crc kubenswrapper[4856]: I0320 13:45:29.128129 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fzpqw" podStartSLOduration=4.48213603 podStartE2EDuration="31.12811356s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="2026-03-20 13:45:00.196518327 +0000 UTC m=+1315.077544457" lastFinishedPulling="2026-03-20 13:45:26.842495857 +0000 UTC m=+1341.723521987" observedRunningTime="2026-03-20 13:45:29.122880036 +0000 UTC m=+1344.003906166" watchObservedRunningTime="2026-03-20 13:45:29.12811356 +0000 UTC m=+1344.009139690" Mar 20 13:45:29 crc kubenswrapper[4856]: W0320 13:45:29.205654 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b717ce0_8fd1_454d_910d_d663dbc1b07a.slice/crio-e5b8e9ed9ba604ee12eef9d71346c30c2d9fcff73f370a3cd12d6102aab9a844 WatchSource:0}: Error finding container e5b8e9ed9ba604ee12eef9d71346c30c2d9fcff73f370a3cd12d6102aab9a844: Status 404 returned error can't find the container with id e5b8e9ed9ba604ee12eef9d71346c30c2d9fcff73f370a3cd12d6102aab9a844 Mar 20 13:45:29 crc kubenswrapper[4856]: W0320 
13:45:29.209398 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb203ab60_5ec5_4897_87bb_915b96c106ed.slice/crio-01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4 WatchSource:0}: Error finding container 01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4: Status 404 returned error can't find the container with id 01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4 Mar 20 13:45:29 crc kubenswrapper[4856]: W0320 13:45:29.211583 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e3128e_5cf2_432f_b268_090de59c9722.slice/crio-d8e2c6c0ee5be608519d178007e0dc8baba4371a461fcd2ca99b298885be7198 WatchSource:0}: Error finding container d8e2c6c0ee5be608519d178007e0dc8baba4371a461fcd2ca99b298885be7198: Status 404 returned error can't find the container with id d8e2c6c0ee5be608519d178007e0dc8baba4371a461fcd2ca99b298885be7198 Mar 20 13:45:29 crc kubenswrapper[4856]: I0320 13:45:29.833331 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed07030a-fbbf-4b61-9aa0-910b9c4ae087" path="/var/lib/kubelet/pods/ed07030a-fbbf-4b61-9aa0-910b9c4ae087/volumes" Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.133191 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97mlx" event={"ID":"b203ab60-5ec5-4897-87bb-915b96c106ed","Type":"ContainerStarted","Data":"929bd48ca16b4b45e1ae843fe0a9e263670cc6925e68d991b3b7e682231f8a91"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.133238 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97mlx" event={"ID":"b203ab60-5ec5-4897-87bb-915b96c106ed","Type":"ContainerStarted","Data":"01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.137805 4856 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerStarted","Data":"7297b820cc2bb8b7bd86556f1cb432b985e0eaab626721bdd32c58f8eec3968d"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.137853 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerStarted","Data":"d8e2c6c0ee5be608519d178007e0dc8baba4371a461fcd2ca99b298885be7198"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.140001 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerStarted","Data":"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.151261 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-97mlx" podStartSLOduration=2.151239815 podStartE2EDuration="2.151239815s" podCreationTimestamp="2026-03-20 13:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:30.146792093 +0000 UTC m=+1345.027818223" watchObservedRunningTime="2026-03-20 13:45:30.151239815 +0000 UTC m=+1345.032265945" Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.162303 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerStarted","Data":"72c22eeafe724ec37dd78b92a7210b42a72b436fec39dc56fb830bb3d35f4900"} Mar 20 13:45:30 crc kubenswrapper[4856]: I0320 13:45:30.162347 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerStarted","Data":"e5b8e9ed9ba604ee12eef9d71346c30c2d9fcff73f370a3cd12d6102aab9a844"} Mar 20 13:45:31 crc kubenswrapper[4856]: I0320 13:45:31.172802 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerStarted","Data":"1b4227016e0c8a4697775639aa23ccd7ad77dbae18ac0cc519c71be221f4e243"} Mar 20 13:45:31 crc kubenswrapper[4856]: I0320 13:45:31.175317 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerStarted","Data":"64e0034feb09eb16bc21de5cb0df509b5c8da8bd04ba03e79ab00d49d565ba4e"} Mar 20 13:45:31 crc kubenswrapper[4856]: I0320 13:45:31.192816 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=19.192795849 podStartE2EDuration="19.192795849s" podCreationTimestamp="2026-03-20 13:45:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:31.191824402 +0000 UTC m=+1346.072850532" watchObservedRunningTime="2026-03-20 13:45:31.192795849 +0000 UTC m=+1346.073821979" Mar 20 13:45:31 crc kubenswrapper[4856]: I0320 13:45:31.221509 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=22.22148777 podStartE2EDuration="22.22148777s" podCreationTimestamp="2026-03-20 13:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:31.212917213 +0000 UTC m=+1346.093943343" watchObservedRunningTime="2026-03-20 13:45:31.22148777 +0000 UTC m=+1346.102513900" Mar 20 13:45:33 crc kubenswrapper[4856]: I0320 13:45:33.588245 4856 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:33 crc kubenswrapper[4856]: I0320 13:45:33.588626 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:33 crc kubenswrapper[4856]: I0320 13:45:33.626762 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:33 crc kubenswrapper[4856]: I0320 13:45:33.646942 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:34 crc kubenswrapper[4856]: I0320 13:45:34.212865 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:34 crc kubenswrapper[4856]: I0320 13:45:34.212960 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:36 crc kubenswrapper[4856]: I0320 13:45:36.220561 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:36 crc kubenswrapper[4856]: I0320 13:45:36.235954 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerStarted","Data":"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda"} Mar 20 13:45:36 crc kubenswrapper[4856]: I0320 13:45:36.238579 4856 generic.go:334] "Generic (PLEG): container finished" podID="b203ab60-5ec5-4897-87bb-915b96c106ed" containerID="929bd48ca16b4b45e1ae843fe0a9e263670cc6925e68d991b3b7e682231f8a91" exitCode=0 Mar 20 13:45:36 crc kubenswrapper[4856]: I0320 13:45:36.239899 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97mlx" 
event={"ID":"b203ab60-5ec5-4897-87bb-915b96c106ed","Type":"ContainerDied","Data":"929bd48ca16b4b45e1ae843fe0a9e263670cc6925e68d991b3b7e682231f8a91"} Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.131657 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.255944 4856 generic.go:334] "Generic (PLEG): container finished" podID="394ec9f9-f47c-4f12-af34-26a3953f7668" containerID="f708b70995f625b5570b66b4c5d623f049c5b186f6b7a641284f83c29309e266" exitCode=0 Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.256169 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fzpqw" event={"ID":"394ec9f9-f47c-4f12-af34-26a3953f7668","Type":"ContainerDied","Data":"f708b70995f625b5570b66b4c5d623f049c5b186f6b7a641284f83c29309e266"} Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.577457 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641030 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8m6gg\" (UniqueName: \"kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641092 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641140 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641181 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641307 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.641379 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys\") pod \"b203ab60-5ec5-4897-87bb-915b96c106ed\" (UID: \"b203ab60-5ec5-4897-87bb-915b96c106ed\") " Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.654738 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts" (OuterVolumeSpecName: "scripts") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.654836 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.655031 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.655938 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg" (OuterVolumeSpecName: "kube-api-access-8m6gg") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "kube-api-access-8m6gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.669925 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data" (OuterVolumeSpecName: "config-data") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.671345 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b203ab60-5ec5-4897-87bb-915b96c106ed" (UID: "b203ab60-5ec5-4897-87bb-915b96c106ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.743492 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.743538 4856 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.743551 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8m6gg\" (UniqueName: \"kubernetes.io/projected/b203ab60-5ec5-4897-87bb-915b96c106ed-kube-api-access-8m6gg\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.743562 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:37 
crc kubenswrapper[4856]: I0320 13:45:37.743573 4856 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:37 crc kubenswrapper[4856]: I0320 13:45:37.743584 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b203ab60-5ec5-4897-87bb-915b96c106ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.274238 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-97mlx" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.274469 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-97mlx" event={"ID":"b203ab60-5ec5-4897-87bb-915b96c106ed","Type":"ContainerDied","Data":"01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4"} Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.274521 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a65f2e23852149b1634fb0c3c8eaf896c2fc02860d4a26f7c7884466740bb4" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.491704 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"] Mar 20 13:45:38 crc kubenswrapper[4856]: E0320 13:45:38.493672 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b203ab60-5ec5-4897-87bb-915b96c106ed" containerName="keystone-bootstrap" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.493696 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b203ab60-5ec5-4897-87bb-915b96c106ed" containerName="keystone-bootstrap" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.493885 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b203ab60-5ec5-4897-87bb-915b96c106ed" 
containerName="keystone-bootstrap" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.496067 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.500289 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.500313 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.501072 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.502351 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-svnn6" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.502576 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.503206 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"] Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.503477 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556223 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556267 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556298 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556329 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556350 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556372 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8v66\" (UniqueName: 
\"kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.556518 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.641992 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fzpqw" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657624 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657671 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657705 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657721 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657738 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8v66\" (UniqueName: \"kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657799 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657858 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.657876 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.664689 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.664962 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.665197 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.666735 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.668777 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.677776 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: 
\"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.677896 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.679679 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8v66\" (UniqueName: \"kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66\") pod \"keystone-7c6b6b7976-vc6rm\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") " pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759190 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts\") pod \"394ec9f9-f47c-4f12-af34-26a3953f7668\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759308 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ks48\" (UniqueName: \"kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48\") pod \"394ec9f9-f47c-4f12-af34-26a3953f7668\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759418 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs\") pod \"394ec9f9-f47c-4f12-af34-26a3953f7668\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759494 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data\") pod \"394ec9f9-f47c-4f12-af34-26a3953f7668\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759517 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle\") pod \"394ec9f9-f47c-4f12-af34-26a3953f7668\" (UID: \"394ec9f9-f47c-4f12-af34-26a3953f7668\") " Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.759866 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs" (OuterVolumeSpecName: "logs") pod "394ec9f9-f47c-4f12-af34-26a3953f7668" (UID: "394ec9f9-f47c-4f12-af34-26a3953f7668"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.762265 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts" (OuterVolumeSpecName: "scripts") pod "394ec9f9-f47c-4f12-af34-26a3953f7668" (UID: "394ec9f9-f47c-4f12-af34-26a3953f7668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.763320 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48" (OuterVolumeSpecName: "kube-api-access-2ks48") pod "394ec9f9-f47c-4f12-af34-26a3953f7668" (UID: "394ec9f9-f47c-4f12-af34-26a3953f7668"). InnerVolumeSpecName "kube-api-access-2ks48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.783808 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "394ec9f9-f47c-4f12-af34-26a3953f7668" (UID: "394ec9f9-f47c-4f12-af34-26a3953f7668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.787124 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data" (OuterVolumeSpecName: "config-data") pod "394ec9f9-f47c-4f12-af34-26a3953f7668" (UID: "394ec9f9-f47c-4f12-af34-26a3953f7668"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.824720 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.860953 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/394ec9f9-f47c-4f12-af34-26a3953f7668-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.860982 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.860991 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.861001 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/394ec9f9-f47c-4f12-af34-26a3953f7668-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:38 crc kubenswrapper[4856]: I0320 13:45:38.861009 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ks48\" (UniqueName: \"kubernetes.io/projected/394ec9f9-f47c-4f12-af34-26a3953f7668-kube-api-access-2ks48\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.285443 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fzpqw" event={"ID":"394ec9f9-f47c-4f12-af34-26a3953f7668","Type":"ContainerDied","Data":"4ced0aafb50dd0080bfad15ce272a7e717951538844fa499140640cb53be21dd"} Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.285751 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ced0aafb50dd0080bfad15ce272a7e717951538844fa499140640cb53be21dd" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.285817 4856 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fzpqw" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.375152 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54db87fb-r8q4w"] Mar 20 13:45:39 crc kubenswrapper[4856]: E0320 13:45:39.376193 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="394ec9f9-f47c-4f12-af34-26a3953f7668" containerName="placement-db-sync" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.376217 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="394ec9f9-f47c-4f12-af34-26a3953f7668" containerName="placement-db-sync" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.376430 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="394ec9f9-f47c-4f12-af34-26a3953f7668" containerName="placement-db-sync" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.377798 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.381312 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.381482 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-cj7cv" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.381487 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.381850 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.381923 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.388077 4856 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/placement-54db87fb-r8q4w"] Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.456630 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"] Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469288 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469375 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469403 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469423 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgcm4\" (UniqueName: \"kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469532 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469681 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.469814 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571162 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571681 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571717 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgcm4\" (UniqueName: 
\"kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571751 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571804 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571852 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.571912 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.572598 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs\") pod \"placement-54db87fb-r8q4w\" (UID: 
\"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.575455 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.576375 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.577507 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.577735 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.580657 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.591956 
4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgcm4\" (UniqueName: \"kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4\") pod \"placement-54db87fb-r8q4w\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.699543 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.987499 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.987580 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.987629 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.989088 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:45:39 crc kubenswrapper[4856]: I0320 13:45:39.989228 4856 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8" gracePeriod=600 Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.296681 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c6b6b7976-vc6rm" event={"ID":"1ac0adc6-d09a-4367-838e-67f78ae5a050","Type":"ContainerStarted","Data":"0b6e7120a465c758f2bb754355b306589c204a9f6ee82cd864aa258ed424cc02"} Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.404522 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.404771 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.404782 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.404790 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.454888 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.464509 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54db87fb-r8q4w"] Mar 20 13:45:40 crc kubenswrapper[4856]: W0320 13:45:40.470126 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda70b9b91_b663_40a8_a2a8_f1f57fc17bab.slice/crio-e62cc3f6af91da5a8582e75d8459868ec9f682889b5dd798e4b76298a0ae5600 WatchSource:0}: Error 
finding container e62cc3f6af91da5a8582e75d8459868ec9f682889b5dd798e4b76298a0ae5600: Status 404 returned error can't find the container with id e62cc3f6af91da5a8582e75d8459868ec9f682889b5dd798e4b76298a0ae5600 Mar 20 13:45:40 crc kubenswrapper[4856]: I0320 13:45:40.477312 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.309818 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c6b6b7976-vc6rm" event={"ID":"1ac0adc6-d09a-4367-838e-67f78ae5a050","Type":"ContainerStarted","Data":"4eb1f6a354f4bcf7cd6a7759bd2b31a120d7b8a455f854f1604a17e887048c77"} Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.313911 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8" exitCode=0 Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.313966 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8"} Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.314039 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585"} Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.314120 4856 scope.go:117] "RemoveContainer" containerID="ed353fb5cf95e10b7e1c35f279b8ccee01a3d5dab85506323310bf7e266f5129" Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.316443 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" 
event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerStarted","Data":"1851f20ad50d19aed32d3be103e4e1bc3e4b3415498ed43e1a10c91964f72276"} Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.316465 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerStarted","Data":"e62cc3f6af91da5a8582e75d8459868ec9f682889b5dd798e4b76298a0ae5600"} Mar 20 13:45:41 crc kubenswrapper[4856]: I0320 13:45:41.330705 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7c6b6b7976-vc6rm" podStartSLOduration=3.330678385 podStartE2EDuration="3.330678385s" podCreationTimestamp="2026-03-20 13:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:41.328023382 +0000 UTC m=+1356.209049522" watchObservedRunningTime="2026-03-20 13:45:41.330678385 +0000 UTC m=+1356.211704525" Mar 20 13:45:42 crc kubenswrapper[4856]: I0320 13:45:42.344249 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerStarted","Data":"89e762b7150be959d006f5453a95ec72bc93fd09878cdcfd7e75bf3a0eb48e5a"} Mar 20 13:45:42 crc kubenswrapper[4856]: I0320 13:45:42.344812 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:45:42 crc kubenswrapper[4856]: I0320 13:45:42.345622 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:42 crc kubenswrapper[4856]: I0320 13:45:42.345664 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:45:42 crc kubenswrapper[4856]: I0320 13:45:42.376754 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-54db87fb-r8q4w" podStartSLOduration=3.376732542 podStartE2EDuration="3.376732542s" podCreationTimestamp="2026-03-20 13:45:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:42.372345001 +0000 UTC m=+1357.253371151" watchObservedRunningTime="2026-03-20 13:45:42.376732542 +0000 UTC m=+1357.257758672" Mar 20 13:45:43 crc kubenswrapper[4856]: I0320 13:45:43.229954 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:45:43 crc kubenswrapper[4856]: I0320 13:45:43.352420 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:45:43 crc kubenswrapper[4856]: I0320 13:45:43.374447 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:45:48 crc kubenswrapper[4856]: E0320 13:45:48.074186 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 13:45:48.405682 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerStarted","Data":"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8"} Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 13:45:48.405876 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="ceilometer-notification-agent" containerID="cri-o://ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b" gracePeriod=30 Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 
13:45:48.405922 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 13:45:48.405954 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="proxy-httpd" containerID="cri-o://60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8" gracePeriod=30 Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 13:45:48.405981 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="sg-core" containerID="cri-o://c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda" gracePeriod=30 Mar 20 13:45:48 crc kubenswrapper[4856]: I0320 13:45:48.408578 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-52xvf" event={"ID":"e39393cf-dda0-4755-8e66-fc571afa2a1a","Type":"ContainerStarted","Data":"ac2cf0183696c7efdb1e246690f9b18fbccf038a604105e4af9135030cb8a10c"} Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.420162 4856 generic.go:334] "Generic (PLEG): container finished" podID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerID="60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8" exitCode=0 Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.420448 4856 generic.go:334] "Generic (PLEG): container finished" podID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerID="c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda" exitCode=2 Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.420510 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerDied","Data":"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8"} Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.420534 4856 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerDied","Data":"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda"} Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.421518 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n4czn" event={"ID":"0a11a777-2932-4a56-898d-2de11472cbc9","Type":"ContainerStarted","Data":"727b99f476c9dba3f82144f27425c67b34ca4340ff2eb5563dea3b3ca28872a6"} Mar 20 13:45:49 crc kubenswrapper[4856]: I0320 13:45:49.446743 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-52xvf" podStartSLOduration=4.004418315 podStartE2EDuration="51.446726351s" podCreationTimestamp="2026-03-20 13:44:58 +0000 UTC" firstStartedPulling="2026-03-20 13:45:00.348671928 +0000 UTC m=+1315.229698058" lastFinishedPulling="2026-03-20 13:45:47.790979964 +0000 UTC m=+1362.672006094" observedRunningTime="2026-03-20 13:45:48.462738983 +0000 UTC m=+1363.343765123" watchObservedRunningTime="2026-03-20 13:45:49.446726351 +0000 UTC m=+1364.327752481" Mar 20 13:45:53 crc kubenswrapper[4856]: I0320 13:45:53.974177 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.000659 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-n4czn" podStartSLOduration=7.535960058 podStartE2EDuration="55.000644131s" podCreationTimestamp="2026-03-20 13:44:59 +0000 UTC" firstStartedPulling="2026-03-20 13:45:00.32952463 +0000 UTC m=+1315.210550760" lastFinishedPulling="2026-03-20 13:45:47.794208683 +0000 UTC m=+1362.675234833" observedRunningTime="2026-03-20 13:45:49.448438838 +0000 UTC m=+1364.329464968" watchObservedRunningTime="2026-03-20 13:45:54.000644131 +0000 UTC m=+1368.881670261" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.138991 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmgz\" (UniqueName: \"kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139121 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139263 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139365 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd\") pod 
\"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139399 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139514 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle\") pod \"cda26562-2768-4d20-8aec-342cb3bf6b8c\" (UID: \"cda26562-2768-4d20-8aec-342cb3bf6b8c\") " Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.139605 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.140043 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.140765 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.145244 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz" (OuterVolumeSpecName: "kube-api-access-grmgz") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "kube-api-access-grmgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.149405 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts" (OuterVolumeSpecName: "scripts") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.174155 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.192230 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.241574 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.241614 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cda26562-2768-4d20-8aec-342cb3bf6b8c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.241627 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.241640 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.241652 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmgz\" (UniqueName: \"kubernetes.io/projected/cda26562-2768-4d20-8aec-342cb3bf6b8c-kube-api-access-grmgz\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.247044 4856 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data" (OuterVolumeSpecName: "config-data") pod "cda26562-2768-4d20-8aec-342cb3bf6b8c" (UID: "cda26562-2768-4d20-8aec-342cb3bf6b8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.342860 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cda26562-2768-4d20-8aec-342cb3bf6b8c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.465565 4856 generic.go:334] "Generic (PLEG): container finished" podID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerID="ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b" exitCode=0 Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.465624 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerDied","Data":"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b"} Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.465664 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cda26562-2768-4d20-8aec-342cb3bf6b8c","Type":"ContainerDied","Data":"79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78"} Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.465687 4856 scope.go:117] "RemoveContainer" containerID="60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.465683 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.496945 4856 scope.go:117] "RemoveContainer" containerID="c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.525616 4856 scope.go:117] "RemoveContainer" containerID="ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.534256 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.553298 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.566692 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.567212 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="proxy-httpd" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567230 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="proxy-httpd" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.567243 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="sg-core" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567249 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="sg-core" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.567363 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="ceilometer-notification-agent" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567372 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="ceilometer-notification-agent" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567538 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="proxy-httpd" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567564 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="sg-core" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.567574 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" containerName="ceilometer-notification-agent" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.569374 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.574253 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.574791 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.578778 4856 scope.go:117] "RemoveContainer" containerID="60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.579514 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8\": container with ID starting with 60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8 not found: ID does not exist" containerID="60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.583477 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8"} err="failed to get container status \"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8\": rpc error: code = NotFound desc = could not find container \"60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8\": container with ID starting with 60998fad9aba877d062da6e47170233c92b26751993729905c1a6b5613c196c8 not found: ID does not exist" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.583553 4856 scope.go:117] "RemoveContainer" containerID="c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.585499 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda\": container with ID starting with c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda not found: ID does not exist" containerID="c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.585536 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda"} err="failed to get container status \"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda\": rpc error: code = NotFound desc = could not find container \"c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda\": container with ID starting with c1d71892d811d991f6f79bdf47b383f9a463cc1c7c4bb5ce02d9ae8fcf907cda not found: ID does not exist" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.585586 4856 scope.go:117] "RemoveContainer" containerID="ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.585965 4856 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b\": container with ID starting with ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b not found: ID does not exist" containerID="ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.586061 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b"} err="failed to get container status \"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b\": rpc error: code = NotFound desc = could not find container \"ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b\": container with ID starting with ea09f26b20d2e30498da75bf200f6728d229bb9247ae817ca8bbe38b1e0e3f5b not found: ID does not exist" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.599251 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648075 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648125 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tnrt\" (UniqueName: \"kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648327 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648373 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648448 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648588 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.648684 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.649905 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcda26562_2768_4d20_8aec_342cb3bf6b8c.slice/crio-79fc5623d8631ad263d48675c00b0870846b9e5a636f2f2a58e2da5996317e78\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcda26562_2768_4d20_8aec_342cb3bf6b8c.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.658024 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:54 crc kubenswrapper[4856]: E0320 13:45:54.658834 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-7tnrt log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="10a82375-18b3-443b-ab69-9c94c184a982" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.750534 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.750919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.750985 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 
13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751002 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tnrt\" (UniqueName: \"kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751063 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751083 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751108 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751585 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.751924 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd\") pod 
\"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.754797 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.755130 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.759965 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.760025 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:54 crc kubenswrapper[4856]: I0320 13:45:54.765514 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tnrt\" (UniqueName: \"kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt\") pod \"ceilometer-0\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " pod="openstack/ceilometer-0" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.474977 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.486769 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.666333 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.666401 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.666559 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.666732 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.667309 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.667378 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.667418 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.667448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tnrt\" (UniqueName: \"kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt\") pod \"10a82375-18b3-443b-ab69-9c94c184a982\" (UID: \"10a82375-18b3-443b-ab69-9c94c184a982\") " Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.668000 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.668066 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.670784 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.672103 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.672152 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt" (OuterVolumeSpecName: "kube-api-access-7tnrt") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "kube-api-access-7tnrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.675417 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts" (OuterVolumeSpecName: "scripts") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.679670 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data" (OuterVolumeSpecName: "config-data") pod "10a82375-18b3-443b-ab69-9c94c184a982" (UID: "10a82375-18b3-443b-ab69-9c94c184a982"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.770028 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.770072 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.770086 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.770097 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/10a82375-18b3-443b-ab69-9c94c184a982-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 
13:45:55.770109 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tnrt\" (UniqueName: \"kubernetes.io/projected/10a82375-18b3-443b-ab69-9c94c184a982-kube-api-access-7tnrt\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.770123 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/10a82375-18b3-443b-ab69-9c94c184a982-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:55 crc kubenswrapper[4856]: I0320 13:45:55.842927 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cda26562-2768-4d20-8aec-342cb3bf6b8c" path="/var/lib/kubelet/pods/cda26562-2768-4d20-8aec-342cb3bf6b8c/volumes" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.486865 4856 generic.go:334] "Generic (PLEG): container finished" podID="e39393cf-dda0-4755-8e66-fc571afa2a1a" containerID="ac2cf0183696c7efdb1e246690f9b18fbccf038a604105e4af9135030cb8a10c" exitCode=0 Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.486960 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.486968 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-52xvf" event={"ID":"e39393cf-dda0-4755-8e66-fc571afa2a1a","Type":"ContainerDied","Data":"ac2cf0183696c7efdb1e246690f9b18fbccf038a604105e4af9135030cb8a10c"} Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.553030 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.565092 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.577005 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.579642 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.583009 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.586735 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.588982 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686365 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686495 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686570 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686633 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686675 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686768 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.686861 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8zsk\" (UniqueName: \"kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk\") pod \"ceilometer-0\" (UID: 
\"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.790834 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.790933 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.790971 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.791009 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.791081 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.791138 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.791191 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8zsk\" (UniqueName: \"kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.791646 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.792322 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.796383 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.797653 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.801203 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.802251 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.809879 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8zsk\" (UniqueName: \"kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk\") pod \"ceilometer-0\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " pod="openstack/ceilometer-0" Mar 20 13:45:56 crc kubenswrapper[4856]: I0320 13:45:56.912633 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.368787 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:57 crc kubenswrapper[4856]: W0320 13:45:57.374382 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb88341df_93ef_4159_9587_0dad1dfee698.slice/crio-a57b5c7749b09efa46d59fa4d815e6d6f0558c836486b9542ecbe882b0cc094a WatchSource:0}: Error finding container a57b5c7749b09efa46d59fa4d815e6d6f0558c836486b9542ecbe882b0cc094a: Status 404 returned error can't find the container with id a57b5c7749b09efa46d59fa4d815e6d6f0558c836486b9542ecbe882b0cc094a Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.497066 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerStarted","Data":"a57b5c7749b09efa46d59fa4d815e6d6f0558c836486b9542ecbe882b0cc094a"} Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.762638 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-52xvf" Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.834607 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10a82375-18b3-443b-ab69-9c94c184a982" path="/var/lib/kubelet/pods/10a82375-18b3-443b-ab69-9c94c184a982/volumes" Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.912699 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle\") pod \"e39393cf-dda0-4755-8e66-fc571afa2a1a\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.913085 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data\") pod \"e39393cf-dda0-4755-8e66-fc571afa2a1a\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.913155 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml4l6\" (UniqueName: \"kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6\") pod \"e39393cf-dda0-4755-8e66-fc571afa2a1a\" (UID: \"e39393cf-dda0-4755-8e66-fc571afa2a1a\") " Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.917602 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e39393cf-dda0-4755-8e66-fc571afa2a1a" (UID: "e39393cf-dda0-4755-8e66-fc571afa2a1a"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.917847 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6" (OuterVolumeSpecName: "kube-api-access-ml4l6") pod "e39393cf-dda0-4755-8e66-fc571afa2a1a" (UID: "e39393cf-dda0-4755-8e66-fc571afa2a1a"). InnerVolumeSpecName "kube-api-access-ml4l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:57 crc kubenswrapper[4856]: I0320 13:45:57.938196 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e39393cf-dda0-4755-8e66-fc571afa2a1a" (UID: "e39393cf-dda0-4755-8e66-fc571afa2a1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.015218 4856 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.015245 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml4l6\" (UniqueName: \"kubernetes.io/projected/e39393cf-dda0-4755-8e66-fc571afa2a1a-kube-api-access-ml4l6\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.015257 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e39393cf-dda0-4755-8e66-fc571afa2a1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.506892 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-52xvf" 
event={"ID":"e39393cf-dda0-4755-8e66-fc571afa2a1a","Type":"ContainerDied","Data":"ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0"} Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.507123 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddc162d75f939084bd9e35ae9c5eab502b573b18b8ec8596a27e10f52e10eeb0" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.506977 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-52xvf" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.509260 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerStarted","Data":"10f7a523c9e1e33739d4020c5e345bc2af5bb975ce991bba158047ae4527f9f8"} Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.919239 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:45:58 crc kubenswrapper[4856]: E0320 13:45:58.919688 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" containerName="barbican-db-sync" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.919712 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" containerName="barbican-db-sync" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.919936 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" containerName="barbican-db-sync" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.921230 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.925683 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.925919 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-vfhl9" Mar 20 13:45:58 crc kubenswrapper[4856]: I0320 13:45:58.926232 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.039606 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.041229 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.041340 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.041390 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6bzt\" (UniqueName: \"kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt\") pod 
\"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.041409 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.041484 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.076331 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.077761 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.083600 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.093530 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.095025 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.103374 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.123072 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.142693 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.142790 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.142868 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.142890 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6bzt\" (UniqueName: \"kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: 
\"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.142906 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.143781 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.148872 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.152145 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.152518 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.153911 4856 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.157977 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.158481 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.159068 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6bzt\" (UniqueName: \"kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt\") pod \"barbican-keystone-listener-65d8844bc8-mjgnh\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.177178 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"] Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246561 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246634 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: 
\"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246677 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246701 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246732 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246928 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.246996 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc\") pod 
\"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247102 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247135 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247172 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hcsn\" (UniqueName: \"kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247197 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247334 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88kj\" (UniqueName: 
\"kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247406 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247495 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247567 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.247657 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnwmt\" (UniqueName: \"kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.257673 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.349675 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.349982 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350024 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hcsn\" (UniqueName: \"kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350303 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350354 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88kj\" (UniqueName: \"kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: 
\"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350379 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350445 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350485 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.350628 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnwmt\" (UniqueName: \"kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351409 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " 
pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351528 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351662 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351730 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351763 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351787 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc 
kubenswrapper[4856]: I0320 13:45:59.351816 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.351847 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.352247 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.352815 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.353145 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.353262 4856 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.353585 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.360002 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.361420 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.366019 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.367862 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.368755 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.369992 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.370289 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.371958 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hcsn\" (UniqueName: \"kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn\") pod \"barbican-worker-56d6d658ff-ch8jp\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.376145 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnwmt\" (UniqueName: 
\"kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt\") pod \"dnsmasq-dns-586bdc5f9-z8bwn\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.380964 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88kj\" (UniqueName: \"kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj\") pod \"barbican-api-b84bbc586-zmdmq\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") " pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.516015 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.516106 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.532677 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerStarted","Data":"f53ba23f2a02629780d5285471dd16c556b6e3756e0594441d441e3306b91f4b"} Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.533050 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.539690 4856 generic.go:334] "Generic (PLEG): container finished" podID="4e59b689-e9d4-460b-8a82-50770f4d4422" containerID="a366776a203a7fa9d5f08eb4671d855fe2eeb0585c1540357869f4722b8099e0" exitCode=0 Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.539818 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sllh9" event={"ID":"4e59b689-e9d4-460b-8a82-50770f4d4422","Type":"ContainerDied","Data":"a366776a203a7fa9d5f08eb4671d855fe2eeb0585c1540357869f4722b8099e0"} Mar 20 13:45:59 crc kubenswrapper[4856]: I0320 13:45:59.742490 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.146999 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-llv2r"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.148653 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-llv2r" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.153527 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.153726 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.157646 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:46:00 crc kubenswrapper[4856]: W0320 13:46:00.157859 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee995e44_3c2c_4ca3_9945_b9b757269749.slice/crio-6138aca70e60643cb8d6b847c00370d88b03dfdea1ff22eb9ceff9d2c4e67ad7 WatchSource:0}: Error finding container 6138aca70e60643cb8d6b847c00370d88b03dfdea1ff22eb9ceff9d2c4e67ad7: Status 404 returned error can't find the container with id 6138aca70e60643cb8d6b847c00370d88b03dfdea1ff22eb9ceff9d2c4e67ad7 Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.157886 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-llv2r"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.168005 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.187330 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddff9\" (UniqueName: \"kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9\") pod \"auto-csr-approver-29566906-llv2r\" (UID: \"c376281a-ae9c-4057-a9ac-1ef731747830\") " pod="openshift-infra/auto-csr-approver-29566906-llv2r" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.190370 4856 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.289248 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddff9\" (UniqueName: \"kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9\") pod \"auto-csr-approver-29566906-llv2r\" (UID: \"c376281a-ae9c-4057-a9ac-1ef731747830\") " pod="openshift-infra/auto-csr-approver-29566906-llv2r" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.312087 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"] Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.313552 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddff9\" (UniqueName: \"kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9\") pod \"auto-csr-approver-29566906-llv2r\" (UID: \"c376281a-ae9c-4057-a9ac-1ef731747830\") " pod="openshift-infra/auto-csr-approver-29566906-llv2r" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.507533 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-llv2r" Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.552533 4856 generic.go:334] "Generic (PLEG): container finished" podID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerID="02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331" exitCode=0 Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.553005 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" event={"ID":"f09f71ba-b703-44fe-926d-7ea32c11c4c7","Type":"ContainerDied","Data":"02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.554008 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" event={"ID":"f09f71ba-b703-44fe-926d-7ea32c11c4c7","Type":"ContainerStarted","Data":"6b604876ac752507b79b7a0fa00e632f3ec0ef2b82405228cd79d782d9b80eb3"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.555889 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerStarted","Data":"12d0d455725d7a6ac68ec89b7a2303532724ab31cb990ca5bcfab6c6d996734c"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.561549 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerStarted","Data":"6138aca70e60643cb8d6b847c00370d88b03dfdea1ff22eb9ceff9d2c4e67ad7"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.564364 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerStarted","Data":"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.564530 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerStarted","Data":"c41f4f77a6dc9ae63974f56e918d9a950963a16f2a534230a5cc670d2791d92c"} Mar 20 13:46:00 crc kubenswrapper[4856]: I0320 13:46:00.579402 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerStarted","Data":"3173f66c5384ed0cfbea760af3ab62b4bb30132511cf60e93b439257feb355cc"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.014808 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-llv2r"] Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.099730 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sllh9" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.231815 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle\") pod \"4e59b689-e9d4-460b-8a82-50770f4d4422\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.231914 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8dqm\" (UniqueName: \"kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm\") pod \"4e59b689-e9d4-460b-8a82-50770f4d4422\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.232081 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config\") pod \"4e59b689-e9d4-460b-8a82-50770f4d4422\" (UID: \"4e59b689-e9d4-460b-8a82-50770f4d4422\") " Mar 20 13:46:01 crc 
kubenswrapper[4856]: I0320 13:46:01.238206 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm" (OuterVolumeSpecName: "kube-api-access-s8dqm") pod "4e59b689-e9d4-460b-8a82-50770f4d4422" (UID: "4e59b689-e9d4-460b-8a82-50770f4d4422"). InnerVolumeSpecName "kube-api-access-s8dqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.260397 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e59b689-e9d4-460b-8a82-50770f4d4422" (UID: "4e59b689-e9d4-460b-8a82-50770f4d4422"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.265155 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config" (OuterVolumeSpecName: "config") pod "4e59b689-e9d4-460b-8a82-50770f4d4422" (UID: "4e59b689-e9d4-460b-8a82-50770f4d4422"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.334350 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.334411 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e59b689-e9d4-460b-8a82-50770f4d4422-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.334423 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8dqm\" (UniqueName: \"kubernetes.io/projected/4e59b689-e9d4-460b-8a82-50770f4d4422-kube-api-access-s8dqm\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.595554 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" event={"ID":"f09f71ba-b703-44fe-926d-7ea32c11c4c7","Type":"ContainerStarted","Data":"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.595995 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.604538 4856 generic.go:334] "Generic (PLEG): container finished" podID="0a11a777-2932-4a56-898d-2de11472cbc9" containerID="727b99f476c9dba3f82144f27425c67b34ca4340ff2eb5563dea3b3ca28872a6" exitCode=0 Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.604610 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n4czn" event={"ID":"0a11a777-2932-4a56-898d-2de11472cbc9","Type":"ContainerDied","Data":"727b99f476c9dba3f82144f27425c67b34ca4340ff2eb5563dea3b3ca28872a6"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.606802 4856 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-llv2r" event={"ID":"c376281a-ae9c-4057-a9ac-1ef731747830","Type":"ContainerStarted","Data":"b236289044dbb8db1406851510909755dc190cc5c9a9c8fcad8353ff77549c79"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.609488 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerStarted","Data":"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.609796 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.609937 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.613783 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sllh9" event={"ID":"4e59b689-e9d4-460b-8a82-50770f4d4422","Type":"ContainerDied","Data":"0d1247cab4472b76ce7c1dc7cdce339226fad1fc69f7b84f2a7e9e7fd4737c60"} Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.613838 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d1247cab4472b76ce7c1dc7cdce339226fad1fc69f7b84f2a7e9e7fd4737c60" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.613920 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-sllh9" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.615967 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" podStartSLOduration=3.615947248 podStartE2EDuration="3.615947248s" podCreationTimestamp="2026-03-20 13:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:01.614978251 +0000 UTC m=+1376.496004381" watchObservedRunningTime="2026-03-20 13:46:01.615947248 +0000 UTC m=+1376.496973378" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.651632 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b84bbc586-zmdmq" podStartSLOduration=2.6516088890000002 podStartE2EDuration="2.651608889s" podCreationTimestamp="2026-03-20 13:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:01.633717456 +0000 UTC m=+1376.514743616" watchObservedRunningTime="2026-03-20 13:46:01.651608889 +0000 UTC m=+1376.532635019" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.768433 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"] Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.797157 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"] Mar 20 13:46:01 crc kubenswrapper[4856]: E0320 13:46:01.797593 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e59b689-e9d4-460b-8a82-50770f4d4422" containerName="neutron-db-sync" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.797618 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e59b689-e9d4-460b-8a82-50770f4d4422" containerName="neutron-db-sync" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.797874 4856 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4e59b689-e9d4-460b-8a82-50770f4d4422" containerName="neutron-db-sync" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.798963 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.816715 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"] Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.921175 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.925376 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.931104 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.933623 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.935902 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-sgh42" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.936552 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945491 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945530 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945609 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945632 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945649 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2ldw\" (UniqueName: \"kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.945707 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:01 crc kubenswrapper[4856]: I0320 13:46:01.950168 4856 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047660 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047718 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047781 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047798 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jv2x\" (UniqueName: \"kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047831 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: 
\"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047907 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047959 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047978 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.047991 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2ldw\" (UniqueName: \"kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.048009 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " 
pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.048032 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.049250 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.049974 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.050499 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.051164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc 
kubenswrapper[4856]: I0320 13:46:02.051670 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.082294 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2ldw\" (UniqueName: \"kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw\") pod \"dnsmasq-dns-85ff748b95-8zqrj\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") " pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.132474 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.149082 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.149125 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jv2x\" (UniqueName: \"kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.149202 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: 
\"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.149234 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.149285 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.156114 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.156117 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.166804 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc 
kubenswrapper[4856]: I0320 13:46:02.169302 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.205965 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jv2x\" (UniqueName: \"kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x\") pod \"neutron-67975c5cc6-2d96h\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.273811 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.391013 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.394029 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.396357 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.401377 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.409502 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.571476 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572244 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9ng8\" (UniqueName: \"kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572293 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572375 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572404 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572427 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.572476 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.673967 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674378 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674449 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674473 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9ng8\" (UniqueName: \"kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674558 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674592 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.674625 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.675986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.680623 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.680793 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.684239 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.686517 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs\") pod 
\"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.691601 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.706684 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9ng8\" (UniqueName: \"kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8\") pod \"barbican-api-6c47c6db4b-7s8m7\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.760255 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:02 crc kubenswrapper[4856]: I0320 13:46:02.996770 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"] Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.095591 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:03 crc kubenswrapper[4856]: W0320 13:46:03.134072 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2782efe_c7dc_4301_a897_cfe6a08aa7fb.slice/crio-303e8852dc2ce0e339ae1a0ccf198eb9a0fbaa193fa851df8183fa15807c76c3 WatchSource:0}: Error finding container 303e8852dc2ce0e339ae1a0ccf198eb9a0fbaa193fa851df8183fa15807c76c3: Status 404 returned error can't find the container with id 303e8852dc2ce0e339ae1a0ccf198eb9a0fbaa193fa851df8183fa15807c76c3 Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.354473 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:46:03 crc kubenswrapper[4856]: W0320 13:46:03.384660 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0be3924_19c6_4eee_bc60_7fbe28336b67.slice/crio-0e50e5a77f055c3cf693267e96a1e2305580bf0c20c030307a6e3c0f5314197d WatchSource:0}: Error finding container 0e50e5a77f055c3cf693267e96a1e2305580bf0c20c030307a6e3c0f5314197d: Status 404 returned error can't find the container with id 0e50e5a77f055c3cf693267e96a1e2305580bf0c20c030307a6e3c0f5314197d Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.437471 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-n4czn" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.499968 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500014 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500041 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpbtj\" (UniqueName: \"kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500133 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500165 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500187 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id\") pod \"0a11a777-2932-4a56-898d-2de11472cbc9\" (UID: \"0a11a777-2932-4a56-898d-2de11472cbc9\") " Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.500461 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.504914 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.507829 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts" (OuterVolumeSpecName: "scripts") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.520693 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj" (OuterVolumeSpecName: "kube-api-access-rpbtj") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "kube-api-access-rpbtj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.539749 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.556814 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data" (OuterVolumeSpecName: "config-data") pod "0a11a777-2932-4a56-898d-2de11472cbc9" (UID: "0a11a777-2932-4a56-898d-2de11472cbc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.602090 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.602143 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.602157 4856 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0a11a777-2932-4a56-898d-2de11472cbc9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.602168 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 
crc kubenswrapper[4856]: I0320 13:46:03.602181 4856 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0a11a777-2932-4a56-898d-2de11472cbc9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.602192 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpbtj\" (UniqueName: \"kubernetes.io/projected/0a11a777-2932-4a56-898d-2de11472cbc9-kube-api-access-rpbtj\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.638068 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerStarted","Data":"0e50e5a77f055c3cf693267e96a1e2305580bf0c20c030307a6e3c0f5314197d"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.639899 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerStarted","Data":"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.639930 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerStarted","Data":"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.642597 4856 generic.go:334] "Generic (PLEG): container finished" podID="c376281a-ae9c-4057-a9ac-1ef731747830" containerID="749708f3aae905141646bbe16b71edc1c0c7123765a07503a72806e70934b3c9" exitCode=0 Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.642656 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-llv2r" 
event={"ID":"c376281a-ae9c-4057-a9ac-1ef731747830","Type":"ContainerDied","Data":"749708f3aae905141646bbe16b71edc1c0c7123765a07503a72806e70934b3c9"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.644396 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" event={"ID":"e5c321b2-351b-4a8e-afe7-2a0345dc4112","Type":"ContainerStarted","Data":"5db82149abdb45516144f28199446f9c52236663bba5ea356b685996008cc808"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.647309 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerStarted","Data":"303e8852dc2ce0e339ae1a0ccf198eb9a0fbaa193fa851df8183fa15807c76c3"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.654845 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerStarted","Data":"981b8b827dab20c49a4b95a3ff976be714d7cf0a7f3e2f70b0c810a4c1492d48"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.654891 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerStarted","Data":"d930765c8702d78cbe6f1e7514f35eb8ca4969feb1ad881999f4f96ab179ba9c"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.656826 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerStarted","Data":"bdcd8efb221765de3539a010acc5add27b2a30f836c7b9510e6bb6cadef93087"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.657377 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.658809 4856 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="dnsmasq-dns" containerID="cri-o://216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e" gracePeriod=10 Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.659100 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-n4czn" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.661390 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-n4czn" event={"ID":"0a11a777-2932-4a56-898d-2de11472cbc9","Type":"ContainerDied","Data":"b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5"} Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.661456 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06862401e8e38ce4d3ef1af2025c61aa03a45e4a0a4ad4e5e0831e37a708dd5" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.673675 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" podStartSLOduration=3.022506785 podStartE2EDuration="5.673657424s" podCreationTimestamp="2026-03-20 13:45:58 +0000 UTC" firstStartedPulling="2026-03-20 13:45:59.846109168 +0000 UTC m=+1374.727135298" lastFinishedPulling="2026-03-20 13:46:02.497259807 +0000 UTC m=+1377.378285937" observedRunningTime="2026-03-20 13:46:03.6669898 +0000 UTC m=+1378.548015940" watchObservedRunningTime="2026-03-20 13:46:03.673657424 +0000 UTC m=+1378.554683554" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.712780 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56d6d658ff-ch8jp" podStartSLOduration=3.330238869 podStartE2EDuration="5.712756521s" podCreationTimestamp="2026-03-20 13:45:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:00.178009418 +0000 UTC m=+1375.059035548" lastFinishedPulling="2026-03-20 
13:46:02.56052707 +0000 UTC m=+1377.441553200" observedRunningTime="2026-03-20 13:46:03.696882384 +0000 UTC m=+1378.577908534" watchObservedRunningTime="2026-03-20 13:46:03.712756521 +0000 UTC m=+1378.593782661" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.742630 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.603127717 podStartE2EDuration="7.742604053s" podCreationTimestamp="2026-03-20 13:45:56 +0000 UTC" firstStartedPulling="2026-03-20 13:45:57.377999729 +0000 UTC m=+1372.259025869" lastFinishedPulling="2026-03-20 13:46:02.517476085 +0000 UTC m=+1377.398502205" observedRunningTime="2026-03-20 13:46:03.735828447 +0000 UTC m=+1378.616854587" watchObservedRunningTime="2026-03-20 13:46:03.742604053 +0000 UTC m=+1378.623630183" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.934982 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:03 crc kubenswrapper[4856]: E0320 13:46:03.935448 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" containerName="cinder-db-sync" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.935529 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" containerName="cinder-db-sync" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.935773 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" containerName="cinder-db-sync" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.937062 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.947146 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.947572 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-tndcb" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.947213 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.947263 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:46:03 crc kubenswrapper[4856]: I0320 13:46:03.976558 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010412 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010474 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010529 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " 
pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010567 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010583 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv6nd\" (UniqueName: \"kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.010627 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.032195 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.081328 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.088122 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.109493 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112104 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112146 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112182 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112212 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv6nd\" (UniqueName: \"kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112228 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112252 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112294 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112331 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112371 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5f2\" (UniqueName: \"kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112390 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112431 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.112452 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.114039 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.128518 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.128986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.134509 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv6nd\" (UniqueName: \"kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.135311 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.145740 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts\") pod \"cinder-scheduler-0\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.176691 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.188208 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.190646 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.205327 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.214793 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.214859 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.214957 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5f2\" (UniqueName: \"kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.215014 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.215076 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.215116 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.216141 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.216688 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.217164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.225361 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.226261 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.277408 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.277987 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5f2\" (UniqueName: \"kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2\") pod \"dnsmasq-dns-5c9776ccc5-ztf4n\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317494 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317564 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317593 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317626 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317656 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317699 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfg6z\" (UniqueName: \"kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.317726 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419221 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419303 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419336 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419358 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419379 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419420 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfg6z\" (UniqueName: \"kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419440 
4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.419553 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.420033 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.431290 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.432830 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.435790 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " 
pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.439535 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfg6z\" (UniqueName: \"kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.444103 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data\") pod \"cinder-api-0\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") " pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.509693 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.537794 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.578178 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.625935 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.626184 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.626215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.626256 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.626350 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnwmt\" (UniqueName: \"kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.626449 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config\") pod \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\" (UID: \"f09f71ba-b703-44fe-926d-7ea32c11c4c7\") " Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.640412 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt" (OuterVolumeSpecName: "kube-api-access-gnwmt") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "kube-api-access-gnwmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.728710 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnwmt\" (UniqueName: \"kubernetes.io/projected/f09f71ba-b703-44fe-926d-7ea32c11c4c7-kube-api-access-gnwmt\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.747530 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerStarted","Data":"f1443a7d3d86f2f83dec5d6552b8e2ae4efe9881c4210ba9a130c8efc2674cfa"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.747571 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerStarted","Data":"e9d320f8afc636ad4d1326713391272c7b71185bfa68270e3d375053bc653bec"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.759036 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.782837 4856 generic.go:334] "Generic (PLEG): container finished" podID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerID="216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e" exitCode=0 Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.782910 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" event={"ID":"f09f71ba-b703-44fe-926d-7ea32c11c4c7","Type":"ContainerDied","Data":"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.782945 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" event={"ID":"f09f71ba-b703-44fe-926d-7ea32c11c4c7","Type":"ContainerDied","Data":"6b604876ac752507b79b7a0fa00e632f3ec0ef2b82405228cd79d782d9b80eb3"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.782962 4856 scope.go:117] "RemoveContainer" containerID="216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.782970 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-586bdc5f9-z8bwn" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.783999 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.788203 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerStarted","Data":"a9aa0ace58f4d55b503d781dd43d4983ad8cbc31b36675ba56fa2bddfba479db"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.788235 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerStarted","Data":"39d74692a61d64ac3b4733a0a061eddb4db4c4aede15d1da75d20e1e1dc827fa"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.788557 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.788590 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.794470 4856 generic.go:334] "Generic (PLEG): container finished" podID="e5c321b2-351b-4a8e-afe7-2a0345dc4112" containerID="d4e83c7e55d06ef9f941773b2cc05383cb420dbb1ab91bb67e023a782b4f7d2a" exitCode=0 Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.794569 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" event={"ID":"e5c321b2-351b-4a8e-afe7-2a0345dc4112","Type":"ContainerDied","Data":"d4e83c7e55d06ef9f941773b2cc05383cb420dbb1ab91bb67e023a782b4f7d2a"} Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.801831 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config" (OuterVolumeSpecName: "config") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.805642 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.808246 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f09f71ba-b703-44fe-926d-7ea32c11c4c7" (UID: "f09f71ba-b703-44fe-926d-7ea32c11c4c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.832605 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.832643 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.832654 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.832666 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 
13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.832679 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f09f71ba-b703-44fe-926d-7ea32c11c4c7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.842864 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6c47c6db4b-7s8m7" podStartSLOduration=2.842841462 podStartE2EDuration="2.842841462s" podCreationTimestamp="2026-03-20 13:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:04.807370415 +0000 UTC m=+1379.688396565" watchObservedRunningTime="2026-03-20 13:46:04.842841462 +0000 UTC m=+1379.723867592" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.874889 4856 scope.go:117] "RemoveContainer" containerID="02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.955500 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.983173 4856 scope.go:117] "RemoveContainer" containerID="216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e" Mar 20 13:46:04 crc kubenswrapper[4856]: E0320 13:46:04.983560 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e\": container with ID starting with 216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e not found: ID does not exist" containerID="216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.983625 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e"} err="failed to get container status \"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e\": rpc error: code = NotFound desc = could not find container \"216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e\": container with ID starting with 216a2a9e4553dbd934f5258941cbac9d25d92fcc6567b7946b9c1274b4010f7e not found: ID does not exist" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.983653 4856 scope.go:117] "RemoveContainer" containerID="02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331" Mar 20 13:46:04 crc kubenswrapper[4856]: E0320 13:46:04.986847 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331\": container with ID starting with 02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331 not found: ID does not exist" containerID="02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331" Mar 20 13:46:04 crc kubenswrapper[4856]: I0320 13:46:04.986879 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331"} err="failed to get container status \"02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331\": rpc error: code = NotFound desc = could not find container \"02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331\": container with ID starting with 02b652b545b1668dcebeb7dd8ba5d53f8945046d48f385ca79a90a47ca55d331 not found: ID does not exist" Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.197449 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:05 crc kubenswrapper[4856]: E0320 13:46:05.200166 4856 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 
13:46:05 crc kubenswrapper[4856]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/e5c321b2-351b-4a8e-afe7-2a0345dc4112/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 20 13:46:05 crc kubenswrapper[4856]: > podSandboxID="5db82149abdb45516144f28199446f9c52236663bba5ea356b685996008cc808" Mar 20 13:46:05 crc kubenswrapper[4856]: E0320 13:46:05.200397 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:46:05 crc kubenswrapper[4856]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n7ch57ch5c5hcch589hf7h577h659h96h5c8h5b4h55fhbbh667h565h5bchcbh58dh7dh5bch586h56ch574h598h67dh5c8h56dh8bh574h564hbch7q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-swift-storage-0,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-swift-storage-0,SubPath:dns-swift-storage-0,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:tr
ue,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2ldw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-85ff748b95-8zqrj_openstack(e5c321b2-351b-4a8e-afe7-2a0345dc4112): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/e5c321b2-351b-4a8e-afe7-2a0345dc4112/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 13:46:05 crc kubenswrapper[4856]: > logger="UnhandledError"
Mar 20 13:46:05 crc kubenswrapper[4856]: E0320 13:46:05.201809 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/e5c321b2-351b-4a8e-afe7-2a0345dc4112/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" podUID="e5c321b2-351b-4a8e-afe7-2a0345dc4112"
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.292600 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-llv2r"
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.295242 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"]
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.304397 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-586bdc5f9-z8bwn"]
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.329509 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"]
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.459947 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddff9\" (UniqueName: \"kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9\") pod \"c376281a-ae9c-4057-a9ac-1ef731747830\" (UID: \"c376281a-ae9c-4057-a9ac-1ef731747830\") "
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.464849 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9" (OuterVolumeSpecName: "kube-api-access-ddff9") pod "c376281a-ae9c-4057-a9ac-1ef731747830" (UID: "c376281a-ae9c-4057-a9ac-1ef731747830"). InnerVolumeSpecName "kube-api-access-ddff9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.563815 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddff9\" (UniqueName: \"kubernetes.io/projected/c376281a-ae9c-4057-a9ac-1ef731747830-kube-api-access-ddff9\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.808259 4856 generic.go:334] "Generic (PLEG): container finished" podID="5e146b98-057f-467e-994a-1fabff7911bd" containerID="bd7558b32a18f228234bf348b4f419616de27672d023997320562223c2465641" exitCode=0
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.808345 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" event={"ID":"5e146b98-057f-467e-994a-1fabff7911bd","Type":"ContainerDied","Data":"bd7558b32a18f228234bf348b4f419616de27672d023997320562223c2465641"}
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.808371 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" event={"ID":"5e146b98-057f-467e-994a-1fabff7911bd","Type":"ContainerStarted","Data":"da87cd94f4587634846a8c3ab62eba016273b3935570b5e97d88e8cf2b3d85fa"}
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.815013 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerStarted","Data":"5116894d5ecc376b3367021e897c812bb452f471afe5f24fa250ca4d4271a1e2"}
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.820234 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-llv2r"
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.849244 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" path="/var/lib/kubelet/pods/f09f71ba-b703-44fe-926d-7ea32c11c4c7/volumes"
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.849896 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-llv2r" event={"ID":"c376281a-ae9c-4057-a9ac-1ef731747830","Type":"ContainerDied","Data":"b236289044dbb8db1406851510909755dc190cc5c9a9c8fcad8353ff77549c79"}
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.849916 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b236289044dbb8db1406851510909755dc190cc5c9a9c8fcad8353ff77549c79"
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.849926 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerStarted","Data":"da17536f535d282d0bca282cda79fbbf3f258acd4baed924dc53a10f6c860a3a"}
Mar 20 13:46:05 crc kubenswrapper[4856]: I0320 13:46:05.967210 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67975c5cc6-2d96h" podStartSLOduration=4.967190395 podStartE2EDuration="4.967190395s" podCreationTimestamp="2026-03-20 13:46:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:05.941150259 +0000 UTC m=+1380.822176389" watchObservedRunningTime="2026-03-20 13:46:05.967190395 +0000 UTC m=+1380.848216525"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.408326 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-7pp6m"]
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.425686 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-7pp6m"]
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.462552 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.594967 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.594999 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.595057 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.595093 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.595160 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.595200 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2ldw\" (UniqueName: \"kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw\") pod \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\" (UID: \"e5c321b2-351b-4a8e-afe7-2a0345dc4112\") "
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.620726 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw" (OuterVolumeSpecName: "kube-api-access-g2ldw") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "kube-api-access-g2ldw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.652871 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.697022 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.697286 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2ldw\" (UniqueName: \"kubernetes.io/projected/e5c321b2-351b-4a8e-afe7-2a0345dc4112-kube-api-access-g2ldw\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.715000 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.722790 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.732383 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.735166 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config" (OuterVolumeSpecName: "config") pod "e5c321b2-351b-4a8e-afe7-2a0345dc4112" (UID: "e5c321b2-351b-4a8e-afe7-2a0345dc4112"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.798704 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.799541 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.799641 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.799733 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5c321b2-351b-4a8e-afe7-2a0345dc4112-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.874186 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerStarted","Data":"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8"}
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.877293 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" event={"ID":"5e146b98-057f-467e-994a-1fabff7911bd","Type":"ContainerStarted","Data":"0b2aff424ce009b50e72675ea6c6d94b540a7fe671e015a60a1c1a25e9891713"}
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.877393 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.884553 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj" event={"ID":"e5c321b2-351b-4a8e-afe7-2a0345dc4112","Type":"ContainerDied","Data":"5db82149abdb45516144f28199446f9c52236663bba5ea356b685996008cc808"}
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.884586 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-8zqrj"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.884712 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67975c5cc6-2d96h"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.885213 4856 scope.go:117] "RemoveContainer" containerID="d4e83c7e55d06ef9f941773b2cc05383cb420dbb1ab91bb67e023a782b4f7d2a"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.903461 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" podStartSLOduration=2.903440629 podStartE2EDuration="2.903440629s" podCreationTimestamp="2026-03-20 13:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:06.900727794 +0000 UTC m=+1381.781753924" watchObservedRunningTime="2026-03-20 13:46:06.903440629 +0000 UTC m=+1381.784466759"
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.982167 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"]
Mar 20 13:46:06 crc kubenswrapper[4856]: I0320 13:46:06.996417 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-8zqrj"]
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.830003 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e4380ff-fdce-457f-a1cf-0a5ed46754a0" path="/var/lib/kubelet/pods/6e4380ff-fdce-457f-a1cf-0a5ed46754a0/volumes"
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.831211 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c321b2-351b-4a8e-afe7-2a0345dc4112" path="/var/lib/kubelet/pods/e5c321b2-351b-4a8e-afe7-2a0345dc4112/volumes"
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.895183 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerStarted","Data":"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"}
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.895231 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerStarted","Data":"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"}
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.899910 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerStarted","Data":"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5"}
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.900252 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.924063 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.059466686 podStartE2EDuration="4.924041975s" podCreationTimestamp="2026-03-20 13:46:03 +0000 UTC" firstStartedPulling="2026-03-20 13:46:04.957227292 +0000 UTC m=+1379.838253422" lastFinishedPulling="2026-03-20 13:46:05.821802581 +0000 UTC m=+1380.702828711" observedRunningTime="2026-03-20 13:46:07.91769571 +0000 UTC m=+1382.798721850" watchObservedRunningTime="2026-03-20 13:46:07.924041975 +0000 UTC m=+1382.805068115"
Mar 20 13:46:07 crc kubenswrapper[4856]: I0320 13:46:07.946680 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.946660428 podStartE2EDuration="3.946660428s" podCreationTimestamp="2026-03-20 13:46:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:07.938053711 +0000 UTC m=+1382.819079851" watchObservedRunningTime="2026-03-20 13:46:07.946660428 +0000 UTC m=+1382.827686558"
Mar 20 13:46:08 crc kubenswrapper[4856]: I0320 13:46:08.000105 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.278530 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.406790 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-58f96446cc-blkvz"]
Mar 20 13:46:09 crc kubenswrapper[4856]: E0320 13:46:09.407317 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="init"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407343 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="init"
Mar 20 13:46:09 crc kubenswrapper[4856]: E0320 13:46:09.407385 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c321b2-351b-4a8e-afe7-2a0345dc4112" containerName="init"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407393 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c321b2-351b-4a8e-afe7-2a0345dc4112" containerName="init"
Mar 20 13:46:09 crc kubenswrapper[4856]: E0320 13:46:09.407408 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c376281a-ae9c-4057-a9ac-1ef731747830" containerName="oc"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407416 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c376281a-ae9c-4057-a9ac-1ef731747830" containerName="oc"
Mar 20 13:46:09 crc kubenswrapper[4856]: E0320 13:46:09.407432 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="dnsmasq-dns"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407438 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="dnsmasq-dns"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407646 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c376281a-ae9c-4057-a9ac-1ef731747830" containerName="oc"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407675 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f09f71ba-b703-44fe-926d-7ea32c11c4c7" containerName="dnsmasq-dns"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.407694 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c321b2-351b-4a8e-afe7-2a0345dc4112" containerName="init"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.419062 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.429049 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.429231 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.430183 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f96446cc-blkvz"]
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559464 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559526 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559620 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559746 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559822 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.559993 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.560245 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd8l9\" (UniqueName: \"kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.661895 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.661965 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.662030 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.662068 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.662102 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.662153 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.662246 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd8l9\" (UniqueName: \"kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.668508 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.668862 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.673637 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.675982 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.677134 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.692064 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.707126 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd8l9\" (UniqueName: \"kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9\") pod \"neutron-58f96446cc-blkvz\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.748671 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f96446cc-blkvz"
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.926197 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api-log" containerID="cri-o://afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" gracePeriod=30
Mar 20 13:46:09 crc kubenswrapper[4856]: I0320 13:46:09.926728 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api" containerID="cri-o://480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5" gracePeriod=30
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.574585 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-58f96446cc-blkvz"]
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.630149 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.791661 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.791967 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792050 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792080 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792110 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792135 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfg6z\" (UniqueName: \"kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792180 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom\") pod \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\" (UID: \"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa\") "
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792304 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792521 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs" (OuterVolumeSpecName: "logs") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792794 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.792819 4856 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.800683 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts" (OuterVolumeSpecName: "scripts") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.814461 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.818850 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z" (OuterVolumeSpecName: "kube-api-access-wfg6z") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "kube-api-access-wfg6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.835519 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.895360 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.895391 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.895407 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfg6z\" (UniqueName: \"kubernetes.io/projected/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-kube-api-access-wfg6z\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.895418 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.950373 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data" (OuterVolumeSpecName: "config-data") pod "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" (UID: "78f6c802-52bb-401a-b2eb-4c0ed89e1bfa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951584 4856 generic.go:334] "Generic (PLEG): container finished" podID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerID="480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5" exitCode=0
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951616 4856 generic.go:334] "Generic (PLEG): container finished" podID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerID="afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" exitCode=143
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951670 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerDied","Data":"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5"}
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951698 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerDied","Data":"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8"}
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951708 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"78f6c802-52bb-401a-b2eb-4c0ed89e1bfa","Type":"ContainerDied","Data":"da17536f535d282d0bca282cda79fbbf3f258acd4baed924dc53a10f6c860a3a"}
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951723 4856 scope.go:117] "RemoveContainer" containerID="480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5"
Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.951884 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.959381 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerStarted","Data":"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30"} Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.959431 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerStarted","Data":"17bd504c851f60275e8b0faecdf7f32f0ba8f22f56e42fd3fd249e92deef7e0d"} Mar 20 13:46:10 crc kubenswrapper[4856]: I0320 13:46:10.999532 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.027771 4856 scope.go:117] "RemoveContainer" containerID="afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.035410 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.045329 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.078329 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:11 crc kubenswrapper[4856]: E0320 13:46:11.078747 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.078764 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api" Mar 20 13:46:11 crc kubenswrapper[4856]: E0320 
13:46:11.078784 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api-log" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.078793 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api-log" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.078955 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.078986 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" containerName="cinder-api-log" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.080089 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.083743 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.085574 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.085903 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.086313 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.111478 4856 scope.go:117] "RemoveContainer" containerID="480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5" Mar 20 13:46:11 crc kubenswrapper[4856]: E0320 13:46:11.117556 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5\": container with ID starting with 480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5 not found: ID does not exist" containerID="480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.117610 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5"} err="failed to get container status \"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5\": rpc error: code = NotFound desc = could not find container \"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5\": container with ID starting with 480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5 not found: ID does not exist" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.117638 4856 scope.go:117] "RemoveContainer" containerID="afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" Mar 20 13:46:11 crc kubenswrapper[4856]: E0320 13:46:11.118995 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8\": container with ID starting with afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8 not found: ID does not exist" containerID="afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.119018 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8"} err="failed to get container status \"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8\": rpc error: code = NotFound desc = could not find container \"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8\": container with ID 
starting with afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8 not found: ID does not exist" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.119033 4856 scope.go:117] "RemoveContainer" containerID="480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.121426 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5"} err="failed to get container status \"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5\": rpc error: code = NotFound desc = could not find container \"480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5\": container with ID starting with 480cdb32d35e2901076907d825b93f5aaf78c7c47361b68164fd3878bc7bb5c5 not found: ID does not exist" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.121454 4856 scope.go:117] "RemoveContainer" containerID="afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.126444 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8"} err="failed to get container status \"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8\": rpc error: code = NotFound desc = could not find container \"afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8\": container with ID starting with afa37da117f1e8162cd0bd80c8643b6b0228a71b2d40d1e2bae3dee3ffbbe3a8 not found: ID does not exist" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203239 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " 
pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203297 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203332 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203353 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhtx\" (UniqueName: \"kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203416 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203441 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203465 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203482 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.203525 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.305285 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.305587 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.305616 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts\") pod 
\"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.305661 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.305689 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhtx\" (UniqueName: \"kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306138 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306449 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306493 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306522 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306547 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.306646 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.312063 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.312106 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.313391 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " 
pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.313714 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.314598 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.319323 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.324503 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhtx\" (UniqueName: \"kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx\") pod \"cinder-api-0\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.346167 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.384404 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.410136 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.462688 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7c6b6b7976-vc6rm" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.832300 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78f6c802-52bb-401a-b2eb-4c0ed89e1bfa" path="/var/lib/kubelet/pods/78f6c802-52bb-401a-b2eb-4c0ed89e1bfa/volumes" Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.973171 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerStarted","Data":"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53"} Mar 20 13:46:11 crc kubenswrapper[4856]: I0320 13:46:11.974521 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-58f96446cc-blkvz" Mar 20 13:46:12 crc kubenswrapper[4856]: I0320 13:46:12.056681 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:46:12 crc kubenswrapper[4856]: I0320 13:46:12.118419 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-58f96446cc-blkvz" podStartSLOduration=3.118396733 podStartE2EDuration="3.118396733s" podCreationTimestamp="2026-03-20 13:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:12.060732084 +0000 UTC m=+1386.941758214" watchObservedRunningTime="2026-03-20 13:46:12.118396733 +0000 UTC m=+1386.999422863" Mar 20 13:46:12 crc kubenswrapper[4856]: I0320 13:46:12.125369 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:12 crc kubenswrapper[4856]: I0320 13:46:12.417882 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-b84bbc586-zmdmq" Mar 20 13:46:13 crc kubenswrapper[4856]: I0320 13:46:13.022818 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerStarted","Data":"98f8a872358d11589374e68a262e864329cd9ae8329df4f8a4f3f630a5b9881f"} Mar 20 13:46:13 crc kubenswrapper[4856]: I0320 13:46:13.023207 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerStarted","Data":"4951677a866d1ded7fb56ec01c6b45fae7a6b5bb13d0240b9af5a5755413fb4f"} Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.034210 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerStarted","Data":"609ee66b8ffd12f755f76cd9565ffad47cf29b376b4955ce6c3cee2c6b3c9d01"} Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.034639 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.053466 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.053446932 podStartE2EDuration="3.053446932s" podCreationTimestamp="2026-03-20 13:46:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:14.052493215 +0000 UTC m=+1388.933519355" watchObservedRunningTime="2026-03-20 13:46:14.053446932 +0000 UTC m=+1388.934473062" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.440829 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.513378 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.553760 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.580416 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.635933 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"] Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.636183 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="dnsmasq-dns" containerID="cri-o://8fd0f2fd98f2558fe316a2a01472241f92264ffd1755b490daf56737d3b1d1e5" gracePeriod=10 Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.875592 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.934922 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"] Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.935141 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b84bbc586-zmdmq" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api-log" containerID="cri-o://111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d" gracePeriod=30 Mar 20 13:46:14 crc kubenswrapper[4856]: I0320 13:46:14.935594 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-b84bbc586-zmdmq" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api" containerID="cri-o://2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5" gracePeriod=30 Mar 20 13:46:15 crc 
kubenswrapper[4856]: I0320 13:46:15.075400 4856 generic.go:334] "Generic (PLEG): container finished" podID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerID="8fd0f2fd98f2558fe316a2a01472241f92264ffd1755b490daf56737d3b1d1e5" exitCode=0 Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.075556 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" event={"ID":"0a15bbae-4b61-484e-a95f-e5de1b17650b","Type":"ContainerDied","Data":"8fd0f2fd98f2558fe316a2a01472241f92264ffd1755b490daf56737d3b1d1e5"} Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.075638 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="cinder-scheduler" containerID="cri-o://3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646" gracePeriod=30 Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.075752 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="probe" containerID="cri-o://e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3" gracePeriod=30 Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.168587 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361036 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361129 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361187 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrqt\" (UniqueName: \"kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361325 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361361 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.361471 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config\") pod \"0a15bbae-4b61-484e-a95f-e5de1b17650b\" (UID: \"0a15bbae-4b61-484e-a95f-e5de1b17650b\") "
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.388724 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt" (OuterVolumeSpecName: "kube-api-access-czrqt") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "kube-api-access-czrqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: E0320 13:46:15.437553 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc376281a_ae9c_4057_a9ac_1ef731747830.slice\": RecentStats: unable to find data in memory cache]"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.451005 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.463492 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config" (OuterVolumeSpecName: "config") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.464149 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrqt\" (UniqueName: \"kubernetes.io/projected/0a15bbae-4b61-484e-a95f-e5de1b17650b-kube-api-access-czrqt\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.464266 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.464381 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.492914 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 20 13:46:15 crc kubenswrapper[4856]: E0320 13:46:15.493420 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="dnsmasq-dns"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.493437 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="dnsmasq-dns"
Mar 20 13:46:15 crc kubenswrapper[4856]: E0320 13:46:15.493461 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="init"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.493469 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="init"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.494146 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" containerName="dnsmasq-dns"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.494774 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.494967 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.497805 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.498566 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-fvchk"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.498700 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.502662 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.540309 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.569778 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.569819 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.593090 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0a15bbae-4b61-484e-a95f-e5de1b17650b" (UID: "0a15bbae-4b61-484e-a95f-e5de1b17650b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.671188 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk6nq\" (UniqueName: \"kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.671351 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.671492 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.671591 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.671719 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0a15bbae-4b61-484e-a95f-e5de1b17650b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.772899 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk6nq\" (UniqueName: \"kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.772990 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.773081 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.773133 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.773749 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.777715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.777716 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.799960 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk6nq\" (UniqueName: \"kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq\") pod \"openstackclient\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " pod="openstack/openstackclient"
Mar 20 13:46:15 crc kubenswrapper[4856]: I0320 13:46:15.826173 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.085823 4856 generic.go:334] "Generic (PLEG): container finished" podID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerID="111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d" exitCode=143
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.085912 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerDied","Data":"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"}
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.088825 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n" event={"ID":"0a15bbae-4b61-484e-a95f-e5de1b17650b","Type":"ContainerDied","Data":"67d9c9c10dcbbb8c5b2905f5fad89995dd7a9cdc31aa39dcdac06e229b24acaa"}
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.088890 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2bz5n"
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.088949 4856 scope.go:117] "RemoveContainer" containerID="8fd0f2fd98f2558fe316a2a01472241f92264ffd1755b490daf56737d3b1d1e5"
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.112705 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"]
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.115246 4856 scope.go:117] "RemoveContainer" containerID="c99856c3c75f033c60969b18882bfc9a84ca71e63f81c5ca1ae48fe708661ec7"
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.123223 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2bz5n"]
Mar 20 13:46:16 crc kubenswrapper[4856]: W0320 13:46:16.592404 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72c24034_7d59_49d7_b3e2_16d875f99bec.slice/crio-59b78ecd9f164de5955dc0bf42ea679c180423d963016925d4ea3332bce8cf37 WatchSource:0}: Error finding container 59b78ecd9f164de5955dc0bf42ea679c180423d963016925d4ea3332bce8cf37: Status 404 returned error can't find the container with id 59b78ecd9f164de5955dc0bf42ea679c180423d963016925d4ea3332bce8cf37
Mar 20 13:46:16 crc kubenswrapper[4856]: I0320 13:46:16.593434 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 20 13:46:17 crc kubenswrapper[4856]: I0320 13:46:17.099457 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"72c24034-7d59-49d7-b3e2-16d875f99bec","Type":"ContainerStarted","Data":"59b78ecd9f164de5955dc0bf42ea679c180423d963016925d4ea3332bce8cf37"}
Mar 20 13:46:17 crc kubenswrapper[4856]: I0320 13:46:17.103051 4856 generic.go:334] "Generic (PLEG): container finished" podID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerID="e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3" exitCode=0
Mar 20 13:46:17 crc kubenswrapper[4856]: I0320 13:46:17.103090 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerDied","Data":"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"}
Mar 20 13:46:17 crc kubenswrapper[4856]: I0320 13:46:17.855785 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a15bbae-4b61-484e-a95f-e5de1b17650b" path="/var/lib/kubelet/pods/0a15bbae-4b61-484e-a95f-e5de1b17650b/volumes"
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.520322 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b84bbc586-zmdmq"
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.628139 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle\") pod \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") "
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.628391 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data\") pod \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") "
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.628432 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom\") pod \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") "
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.628629 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88kj\" (UniqueName: \"kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj\") pod \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") "
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.628747 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs\") pod \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\" (UID: \"76dd8778-4e86-4164-8d2c-fba1fd6509cd\") "
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.629286 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs" (OuterVolumeSpecName: "logs") pod "76dd8778-4e86-4164-8d2c-fba1fd6509cd" (UID: "76dd8778-4e86-4164-8d2c-fba1fd6509cd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.634351 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "76dd8778-4e86-4164-8d2c-fba1fd6509cd" (UID: "76dd8778-4e86-4164-8d2c-fba1fd6509cd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.653588 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj" (OuterVolumeSpecName: "kube-api-access-n88kj") pod "76dd8778-4e86-4164-8d2c-fba1fd6509cd" (UID: "76dd8778-4e86-4164-8d2c-fba1fd6509cd"). InnerVolumeSpecName "kube-api-access-n88kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.671635 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76dd8778-4e86-4164-8d2c-fba1fd6509cd" (UID: "76dd8778-4e86-4164-8d2c-fba1fd6509cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.702548 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data" (OuterVolumeSpecName: "config-data") pod "76dd8778-4e86-4164-8d2c-fba1fd6509cd" (UID: "76dd8778-4e86-4164-8d2c-fba1fd6509cd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.730758 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.730799 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.730812 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/76dd8778-4e86-4164-8d2c-fba1fd6509cd-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.730824 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88kj\" (UniqueName: \"kubernetes.io/projected/76dd8778-4e86-4164-8d2c-fba1fd6509cd-kube-api-access-n88kj\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:18 crc kubenswrapper[4856]: I0320 13:46:18.730837 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76dd8778-4e86-4164-8d2c-fba1fd6509cd-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.047100 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.149841 4856 generic.go:334] "Generic (PLEG): container finished" podID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerID="3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646" exitCode=0
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.149945 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.149951 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerDied","Data":"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"}
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.150602 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"9bc68aa9-5f25-41e6-8597-04eb935b7511","Type":"ContainerDied","Data":"5116894d5ecc376b3367021e897c812bb452f471afe5f24fa250ca4d4271a1e2"}
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.150634 4856 scope.go:117] "RemoveContainer" containerID="e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.153519 4856 generic.go:334] "Generic (PLEG): container finished" podID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerID="2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5" exitCode=0
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.153624 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerDied","Data":"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"}
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.153702 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b84bbc586-zmdmq" event={"ID":"76dd8778-4e86-4164-8d2c-fba1fd6509cd","Type":"ContainerDied","Data":"c41f4f77a6dc9ae63974f56e918d9a950963a16f2a534230a5cc670d2791d92c"}
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.153837 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b84bbc586-zmdmq"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.172951 4856 scope.go:117] "RemoveContainer" containerID="3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.195008 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"]
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.207950 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-b84bbc586-zmdmq"]
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.212351 4856 scope.go:117] "RemoveContainer" containerID="e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.212964 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3\": container with ID starting with e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3 not found: ID does not exist" containerID="e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.213157 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3"} err="failed to get container status \"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3\": rpc error: code = NotFound desc = could not find container \"e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3\": container with ID starting with e4d57cc11b5c70959e11d23078ae1d44fa8edfdabd3e4619690e322e7c17e2e3 not found: ID does not exist"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.213352 4856 scope.go:117] "RemoveContainer" containerID="3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.213920 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646\": container with ID starting with 3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646 not found: ID does not exist" containerID="3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.214059 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646"} err="failed to get container status \"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646\": rpc error: code = NotFound desc = could not find container \"3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646\": container with ID starting with 3323881319c76116dccbce7a8e23919c894d0fb5b4f128b99c73b707c1e9a646 not found: ID does not exist"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.214138 4856 scope.go:117] "RemoveContainer" containerID="2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.240837 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.241106 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.241219 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.241342 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.241558 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.241691 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv6nd\" (UniqueName: \"kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd\") pod \"9bc68aa9-5f25-41e6-8597-04eb935b7511\" (UID: \"9bc68aa9-5f25-41e6-8597-04eb935b7511\") "
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.246695 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.246903 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts" (OuterVolumeSpecName: "scripts") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.248787 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd" (OuterVolumeSpecName: "kube-api-access-xv6nd") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "kube-api-access-xv6nd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.249964 4856 scope.go:117] "RemoveContainer" containerID="111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.251897 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.311392 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.344802 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.344857 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.344870 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.344880 4856 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9bc68aa9-5f25-41e6-8597-04eb935b7511-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.344890 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xv6nd\" (UniqueName: \"kubernetes.io/projected/9bc68aa9-5f25-41e6-8597-04eb935b7511-kube-api-access-xv6nd\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.357919 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data" (OuterVolumeSpecName: "config-data") pod "9bc68aa9-5f25-41e6-8597-04eb935b7511" (UID: "9bc68aa9-5f25-41e6-8597-04eb935b7511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.424557 4856 scope.go:117] "RemoveContainer" containerID="2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.425515 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5\": container with ID starting with 2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5 not found: ID does not exist" containerID="2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.425551 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5"} err="failed to get container status \"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5\": rpc error: code = NotFound desc = could not find container \"2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5\": container with ID starting with 2abf173ba3abde37be9b0c512edd78b2a8f89262aa832f67129e76b74139c9a5 not found: ID does not exist"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.425577 4856 scope.go:117] "RemoveContainer" containerID="111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.427439 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d\": container with ID starting with 111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d not found: ID does not exist" containerID="111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.427468 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d"} err="failed to get container status \"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d\": rpc error: code = NotFound desc = could not find container \"111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d\": container with ID starting with 111ebc4b572b9a935efa1ee0b3227ff7a1d5679eb248298fdf53e103f225f79d not found: ID does not exist"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.446623 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc68aa9-5f25-41e6-8597-04eb935b7511-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.484394 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.492499 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.517385 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.517950 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="cinder-scheduler"
Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.517997 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="cinder-scheduler"
Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.518015 4856 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="probe" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518023 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="probe" Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.518037 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api-log" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518047 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api-log" Mar 20 13:46:19 crc kubenswrapper[4856]: E0320 13:46:19.518112 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518122 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518405 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="cinder-scheduler" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518429 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api-log" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518439 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" containerName="probe" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.518459 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" containerName="barbican-api" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.519729 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.525578 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.537380 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.650975 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.651043 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.651067 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.651196 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.651244 4856 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnnl\" (UniqueName: \"kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.651290 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753448 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753574 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753588 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753642 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753664 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753758 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.753823 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnnl\" (UniqueName: \"kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.759952 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.760531 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " 
pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.760916 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.772364 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnnl\" (UniqueName: \"kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.782040 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data\") pod \"cinder-scheduler-0\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.790911 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.792340 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.799254 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.800491 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.801974 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.802084 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.841545 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76dd8778-4e86-4164-8d2c-fba1fd6509cd" path="/var/lib/kubelet/pods/76dd8778-4e86-4164-8d2c-fba1fd6509cd/volumes" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.842182 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc68aa9-5f25-41e6-8597-04eb935b7511" path="/var/lib/kubelet/pods/9bc68aa9-5f25-41e6-8597-04eb935b7511/volumes" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.852523 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855756 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855807 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855839 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855900 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855927 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: 
\"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855953 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.855975 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.856013 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqr2d\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.957948 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958348 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs\") pod 
\"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958395 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958481 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958512 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958538 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958566 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " 
pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.958623 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqr2d\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.959718 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.960063 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.963192 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.963701 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.969942 
4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.980564 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.980961 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:19 crc kubenswrapper[4856]: I0320 13:46:19.983949 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqr2d\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d\") pod \"swift-proxy-844867ddc-kgprv\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:20 crc kubenswrapper[4856]: I0320 13:46:20.124494 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:20 crc kubenswrapper[4856]: I0320 13:46:20.330703 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:20 crc kubenswrapper[4856]: I0320 13:46:20.739047 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:46:21 crc kubenswrapper[4856]: I0320 13:46:21.201884 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerStarted","Data":"c0500e5534d244ad7df38fc881c26294037eedb93e6d08de11d7ab3a2446cade"} Mar 20 13:46:21 crc kubenswrapper[4856]: I0320 13:46:21.202234 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerStarted","Data":"d8fd0d9d1348d14868db817e4088cc9ef731c82b6e610757db5f4ebb231384c6"} Mar 20 13:46:21 crc kubenswrapper[4856]: I0320 13:46:21.207429 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerStarted","Data":"0dba6d4b780a407897cd32686827c4483d0924be6ffdda04151d3f6aaee1a114"} Mar 20 13:46:21 crc kubenswrapper[4856]: I0320 13:46:21.207472 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerStarted","Data":"9fa9cf2114bf93f588f54c41845ed0f4c9ac74eb3965b3521d8055bcaaaa708a"} Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.225188 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerStarted","Data":"1b8689a9cd33b372d58dba21c6bfa7ae8b7939cff4da589d3f1a21739591dc55"} Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 
13:46:22.225776 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.225800 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.229389 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerStarted","Data":"0537ae0b31b688d61e33527cfef4b3a828f73f588fd203119e3e0fc88c53d392"} Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.256765 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-844867ddc-kgprv" podStartSLOduration=3.256740911 podStartE2EDuration="3.256740911s" podCreationTimestamp="2026-03-20 13:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:22.251939518 +0000 UTC m=+1397.132965668" watchObservedRunningTime="2026-03-20 13:46:22.256740911 +0000 UTC m=+1397.137767041" Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.277933 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.277906554 podStartE2EDuration="3.277906554s" podCreationTimestamp="2026-03-20 13:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:22.270554781 +0000 UTC m=+1397.151580921" watchObservedRunningTime="2026-03-20 13:46:22.277906554 +0000 UTC m=+1397.158932684" Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.468132 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.468495 4856 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-central-agent" containerID="cri-o://10f7a523c9e1e33739d4020c5e345bc2af5bb975ce991bba158047ae4527f9f8" gracePeriod=30 Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.468612 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="proxy-httpd" containerID="cri-o://bdcd8efb221765de3539a010acc5add27b2a30f836c7b9510e6bb6cadef93087" gracePeriod=30 Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.468626 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-notification-agent" containerID="cri-o://f53ba23f2a02629780d5285471dd16c556b6e3756e0594441d441e3306b91f4b" gracePeriod=30 Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.468596 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="sg-core" containerID="cri-o://3173f66c5384ed0cfbea760af3ab62b4bb30132511cf60e93b439257feb355cc" gracePeriod=30 Mar 20 13:46:22 crc kubenswrapper[4856]: I0320 13:46:22.477603 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.246822 4856 generic.go:334] "Generic (PLEG): container finished" podID="b88341df-93ef-4159-9587-0dad1dfee698" containerID="bdcd8efb221765de3539a010acc5add27b2a30f836c7b9510e6bb6cadef93087" exitCode=0 Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.247127 4856 generic.go:334] "Generic (PLEG): container finished" podID="b88341df-93ef-4159-9587-0dad1dfee698" containerID="3173f66c5384ed0cfbea760af3ab62b4bb30132511cf60e93b439257feb355cc" exitCode=2 Mar 20 13:46:23 crc 
kubenswrapper[4856]: I0320 13:46:23.247137 4856 generic.go:334] "Generic (PLEG): container finished" podID="b88341df-93ef-4159-9587-0dad1dfee698" containerID="f53ba23f2a02629780d5285471dd16c556b6e3756e0594441d441e3306b91f4b" exitCode=0 Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.247144 4856 generic.go:334] "Generic (PLEG): container finished" podID="b88341df-93ef-4159-9587-0dad1dfee698" containerID="10f7a523c9e1e33739d4020c5e345bc2af5bb975ce991bba158047ae4527f9f8" exitCode=0 Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.246909 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerDied","Data":"bdcd8efb221765de3539a010acc5add27b2a30f836c7b9510e6bb6cadef93087"} Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.247420 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerDied","Data":"3173f66c5384ed0cfbea760af3ab62b4bb30132511cf60e93b439257feb355cc"} Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.247459 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerDied","Data":"f53ba23f2a02629780d5285471dd16c556b6e3756e0594441d441e3306b91f4b"} Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.247472 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerDied","Data":"10f7a523c9e1e33739d4020c5e345bc2af5bb975ce991bba158047ae4527f9f8"} Mar 20 13:46:23 crc kubenswrapper[4856]: I0320 13:46:23.694016 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:46:24 crc kubenswrapper[4856]: I0320 13:46:24.852847 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Mar 20 13:46:25 crc kubenswrapper[4856]: E0320 13:46:25.666069 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc376281a_ae9c_4057_a9ac_1ef731747830.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:46:26 crc kubenswrapper[4856]: I0320 13:46:26.913428 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.158:3000/\": dial tcp 10.217.0.158:3000: connect: connection refused" Mar 20 13:46:28 crc kubenswrapper[4856]: I0320 13:46:28.514729 4856 scope.go:117] "RemoveContainer" containerID="da5cecda24f74ee322b6814a218802e345f531ef689169072cc0d6b603a9d49b" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.253791 4856 scope.go:117] "RemoveContainer" containerID="bc3db4dd9b67dde3de860b22891ed04c026a2b4409012901bbf29b4c1ab5f56b" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.400152 4856 scope.go:117] "RemoveContainer" containerID="c03534640e2329a84ec3890ad41a2236b36338e5f6e852f1006406e4361e9894" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.513499 4856 scope.go:117] "RemoveContainer" containerID="357463eacd2c4e5f13045f3fee26166a450b0819176e2568471cf31e68a58020" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.588017 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768282 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768787 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768822 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768856 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8zsk\" (UniqueName: \"kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768886 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.768954 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.769134 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data\") pod \"b88341df-93ef-4159-9587-0dad1dfee698\" (UID: \"b88341df-93ef-4159-9587-0dad1dfee698\") " Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.770114 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.770422 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.773889 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts" (OuterVolumeSpecName: "scripts") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.774003 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk" (OuterVolumeSpecName: "kube-api-access-n8zsk") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "kube-api-access-n8zsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.802782 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.861657 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.867955 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data" (OuterVolumeSpecName: "config-data") pod "b88341df-93ef-4159-9587-0dad1dfee698" (UID: "b88341df-93ef-4159-9587-0dad1dfee698"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871314 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871342 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871355 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871365 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8zsk\" (UniqueName: \"kubernetes.io/projected/b88341df-93ef-4159-9587-0dad1dfee698-kube-api-access-n8zsk\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871378 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871390 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b88341df-93ef-4159-9587-0dad1dfee698-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:29 crc kubenswrapper[4856]: I0320 13:46:29.871400 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b88341df-93ef-4159-9587-0dad1dfee698-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.095303 4856 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.136906 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.140474 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.335650 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b88341df-93ef-4159-9587-0dad1dfee698","Type":"ContainerDied","Data":"a57b5c7749b09efa46d59fa4d815e6d6f0558c836486b9542ecbe882b0cc094a"} Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.335972 4856 scope.go:117] "RemoveContainer" containerID="bdcd8efb221765de3539a010acc5add27b2a30f836c7b9510e6bb6cadef93087" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.335982 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.348438 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"72c24034-7d59-49d7-b3e2-16d875f99bec","Type":"ContainerStarted","Data":"31ce21f74e86925f967cb789712863c9c0a6a319a2909da14e6791c48d33e2e1"} Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.367983 4856 scope.go:117] "RemoveContainer" containerID="3173f66c5384ed0cfbea760af3ab62b4bb30132511cf60e93b439257feb355cc" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.372537 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.449837705 podStartE2EDuration="15.37251062s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.594775437 +0000 UTC m=+1391.475801567" lastFinishedPulling="2026-03-20 13:46:29.517448352 +0000 UTC m=+1404.398474482" observedRunningTime="2026-03-20 13:46:30.362482214 +0000 UTC m=+1405.243508374" watchObservedRunningTime="2026-03-20 13:46:30.37251062 +0000 UTC m=+1405.253536750" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.405785 4856 scope.go:117] "RemoveContainer" containerID="f53ba23f2a02629780d5285471dd16c556b6e3756e0594441d441e3306b91f4b" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.432227 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.451114 4856 scope.go:117] "RemoveContainer" containerID="10f7a523c9e1e33739d4020c5e345bc2af5bb975ce991bba158047ae4527f9f8" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.464278 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472018 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: E0320 
13:46:30.472498 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="sg-core" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472517 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="sg-core" Mar 20 13:46:30 crc kubenswrapper[4856]: E0320 13:46:30.472538 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-central-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472545 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-central-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: E0320 13:46:30.472561 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-notification-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472567 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-notification-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: E0320 13:46:30.472577 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="proxy-httpd" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472583 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="proxy-httpd" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472732 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-notification-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472748 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="sg-core" Mar 20 13:46:30 crc 
kubenswrapper[4856]: I0320 13:46:30.472760 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="ceilometer-central-agent" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.472769 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88341df-93ef-4159-9587-0dad1dfee698" containerName="proxy-httpd" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.474244 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.476133 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.476239 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.480380 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.607297 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: E0320 13:46:30.608021 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-2lmgv log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611005 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 
13:46:30.611357 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611552 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611621 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611794 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lmgv\" (UniqueName: \"kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611896 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.611968 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.638456 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.638666 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" containerName="kube-state-metrics" containerID="cri-o://33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55" gracePeriod=30 Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713597 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713682 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713711 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713859 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713933 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.713958 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.714669 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.714728 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.714814 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lmgv\" (UniqueName: \"kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.720284 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.720536 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.725507 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.726037 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4856]: I0320 13:46:30.735047 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lmgv\" (UniqueName: \"kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv\") pod \"ceilometer-0\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " pod="openstack/ceilometer-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.124417 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.224116 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l24l2\" (UniqueName: \"kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2\") pod \"e0d71c6e-58a3-48cb-8a06-564dafb339d4\" (UID: \"e0d71c6e-58a3-48cb-8a06-564dafb339d4\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.229296 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2" (OuterVolumeSpecName: "kube-api-access-l24l2") pod "e0d71c6e-58a3-48cb-8a06-564dafb339d4" (UID: "e0d71c6e-58a3-48cb-8a06-564dafb339d4"). InnerVolumeSpecName "kube-api-access-l24l2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.326396 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l24l2\" (UniqueName: \"kubernetes.io/projected/e0d71c6e-58a3-48cb-8a06-564dafb339d4-kube-api-access-l24l2\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.355184 4856 generic.go:334] "Generic (PLEG): container finished" podID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" containerID="33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55" exitCode=2 Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.355278 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.355285 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.356005 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0d71c6e-58a3-48cb-8a06-564dafb339d4","Type":"ContainerDied","Data":"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55"} Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.356055 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e0d71c6e-58a3-48cb-8a06-564dafb339d4","Type":"ContainerDied","Data":"f778ece17ed602f614d0484d671f0c237a94e47d5c3633a652924b63b46653d6"} Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.356076 4856 scope.go:117] "RemoveContainer" containerID="33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.364287 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.392094 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.394766 4856 scope.go:117] "RemoveContainer" containerID="33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55" Mar 20 13:46:31 crc kubenswrapper[4856]: E0320 13:46:31.396433 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55\": container with ID starting with 33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55 not found: ID does not exist" containerID="33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.396491 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55"} err="failed to get container status \"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55\": rpc error: code = NotFound desc = could not find container \"33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55\": container with ID starting with 33189cc1c76a7c475303bf68189e75a53d964844b6f45c4a6d3ef57176251c55 not found: ID does not exist" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.401612 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.429065 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:31 crc kubenswrapper[4856]: E0320 13:46:31.429564 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" containerName="kube-state-metrics" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.429590 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" containerName="kube-state-metrics" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.429818 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" containerName="kube-state-metrics" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.445249 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.450005 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.451450 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.451997 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530369 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530428 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530466 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lmgv\" (UniqueName: \"kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530533 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 
13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530595 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530631 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.530680 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data\") pod \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\" (UID: \"0eff5669-a6ea-461b-a0e6-4c1778a9e2c7\") " Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.531234 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.531419 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.533587 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.534929 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts" (OuterVolumeSpecName: "scripts") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.535508 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv" (OuterVolumeSpecName: "kube-api-access-2lmgv") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "kube-api-access-2lmgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.535800 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data" (OuterVolumeSpecName: "config-data") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.546734 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" (UID: "0eff5669-a6ea-461b-a0e6-4c1778a9e2c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.632725 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.632797 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.632910 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2k7\" (UniqueName: \"kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633003 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633235 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633256 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633272 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lmgv\" (UniqueName: \"kubernetes.io/projected/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-kube-api-access-2lmgv\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633304 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633316 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633327 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.633339 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.735243 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.735398 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.735419 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.735463 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2k7\" (UniqueName: \"kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.740336 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: 
\"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.741676 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.742981 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.747932 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-pjrhc"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.749111 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.769089 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pjrhc"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.771586 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf2k7\" (UniqueName: \"kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7\") pod \"kube-state-metrics-0\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " pod="openstack/kube-state-metrics-0" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.828439 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88341df-93ef-4159-9587-0dad1dfee698" path="/var/lib/kubelet/pods/b88341df-93ef-4159-9587-0dad1dfee698/volumes" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.829512 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0d71c6e-58a3-48cb-8a06-564dafb339d4" path="/var/lib/kubelet/pods/e0d71c6e-58a3-48cb-8a06-564dafb339d4/volumes" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.855606 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-99lcm"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.856725 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.870736 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-99lcm"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.939183 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2dz\" (UniqueName: \"kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.939662 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.959964 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5b9c-account-create-update-wqlhb"] Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.961492 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.963661 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 13:46:31 crc kubenswrapper[4856]: I0320 13:46:31.983819 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-wqlhb"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.041860 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2dz\" (UniqueName: \"kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.041977 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts\") pod \"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.042012 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sqqv\" (UniqueName: \"kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv\") pod \"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.042107 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " 
pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.043055 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.050130 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bv7js"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.051486 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.059196 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bv7js"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.068144 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.082517 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2dz\" (UniqueName: \"kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz\") pod \"nova-api-db-create-pjrhc\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.137175 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.148977 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h9r\" (UniqueName: \"kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.149149 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts\") pod \"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.149186 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sqqv\" (UniqueName: \"kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv\") pod \"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.149413 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.150214 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts\") pod 
\"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.179556 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sqqv\" (UniqueName: \"kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv\") pod \"nova-cell0-db-create-99lcm\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.187999 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db61-account-create-update-wg8jz"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.191506 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.197106 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.218413 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-wg8jz"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.250698 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.250975 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvspp\" (UniqueName: \"kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp\") pod \"nova-cell1-db-create-bv7js\" (UID: 
\"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.251110 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h9r\" (UniqueName: \"kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.251287 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts\") pod \"nova-cell1-db-create-bv7js\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.252185 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.281438 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h9r\" (UniqueName: \"kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r\") pod \"nova-api-5b9c-account-create-update-wqlhb\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.292099 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.298271 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.363195 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts\") pod \"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.364231 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvspp\" (UniqueName: \"kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp\") pod \"nova-cell1-db-create-bv7js\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.364417 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5h26\" (UniqueName: \"kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26\") pod \"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.364668 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts\") pod \"nova-cell1-db-create-bv7js\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 
13:46:32.367184 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts\") pod \"nova-cell1-db-create-bv7js\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.391527 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvspp\" (UniqueName: \"kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp\") pod \"nova-cell1-db-create-bv7js\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.396843 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9268-account-create-update-qhrq6"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.399751 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.400631 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.402958 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.408999 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-qhrq6"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.465510 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.465598 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzk6n\" (UniqueName: \"kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.465750 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts\") pod \"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.465815 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5h26\" (UniqueName: \"kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26\") pod 
\"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.466781 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts\") pod \"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.472559 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.496580 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5h26\" (UniqueName: \"kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26\") pod \"nova-cell0-db61-account-create-update-wg8jz\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.524340 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.548672 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.549111 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.552868 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.555042 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.563820 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.564061 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.564138 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.569624 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.569765 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzk6n\" (UniqueName: \"kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.571391 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.591027 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzk6n\" (UniqueName: 
\"kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n\") pod \"nova-cell1-9268-account-create-update-qhrq6\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.668109 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673145 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673289 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673586 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673697 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673781 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673824 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.673848 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb29k\" (UniqueName: \"kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.773560 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.775824 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.775857 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.775879 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xb29k\" (UniqueName: \"kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.775902 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.775978 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.776000 4856 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.776322 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-pjrhc"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.776035 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.777131 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.777588 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.780966 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.782483 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts\") pod 
\"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.782489 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.786288 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.795850 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb29k\" (UniqueName: \"kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k\") pod \"ceilometer-0\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.800966 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.887871 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:32 crc kubenswrapper[4856]: I0320 13:46:32.970464 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-wqlhb"] Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.028067 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.118583 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-99lcm"] Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.218912 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-wg8jz"] Mar 20 13:46:33 crc kubenswrapper[4856]: W0320 13:46:33.223854 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod976fa187_ddb7_4116_8476_fb55efdbe660.slice/crio-4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3 WatchSource:0}: Error finding container 4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3: Status 404 returned error can't find the container with id 4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3 Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.339688 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bv7js"] Mar 20 13:46:33 crc kubenswrapper[4856]: W0320 13:46:33.352358 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94f62824_29e7_4d19_afd3_43ddb0867654.slice/crio-16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8 WatchSource:0}: Error finding container 16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8: Status 404 returned error can't find the container with id 16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8 Mar 20 13:46:33 crc 
kubenswrapper[4856]: I0320 13:46:33.425711 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" event={"ID":"2b8951ab-81c4-4f8f-8d45-f061c3a397da","Type":"ContainerStarted","Data":"f4db134aa1fb1676999455edc540061e8b9ce9c16ee9acfc003c7b44c61b255c"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.425768 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" event={"ID":"2b8951ab-81c4-4f8f-8d45-f061c3a397da","Type":"ContainerStarted","Data":"c45cfd0f9ca5cfe7eb1bc1fb69d7fcbb7ae2c62432b71b10490b34ddf86e3e7b"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.432726 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" event={"ID":"976fa187-ddb7-4116-8476-fb55efdbe660","Type":"ContainerStarted","Data":"4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.438621 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-99lcm" event={"ID":"5ac312f2-a405-42c5-980c-2791676ef7e0","Type":"ContainerStarted","Data":"c1f0b030b7dca10cc1d7c3a61a01145db30e251a15c3d2fe77e767dc544868e2"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.441298 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"82ca80b6-bf8a-4741-a5e0-059f20fae69b","Type":"ContainerStarted","Data":"04bc4b2c7647c23ac5a613af742f84aa6b715cccf702fa9771892e53d2a1c079"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.443689 4856 generic.go:334] "Generic (PLEG): container finished" podID="885ead36-c4fe-42e5-8d15-95d1115cfcf4" containerID="47598b985bcb2355ceb6386186e9855bab5c284b8b5f4f8be5e0ef8201c62f0a" exitCode=0 Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.443761 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-db-create-pjrhc" event={"ID":"885ead36-c4fe-42e5-8d15-95d1115cfcf4","Type":"ContainerDied","Data":"47598b985bcb2355ceb6386186e9855bab5c284b8b5f4f8be5e0ef8201c62f0a"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.443787 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjrhc" event={"ID":"885ead36-c4fe-42e5-8d15-95d1115cfcf4","Type":"ContainerStarted","Data":"6350db39862701ad10a22d0f9f8496dc864a8b06cd5b361f098a801ac247531b"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.447386 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv7js" event={"ID":"94f62824-29e7-4d19-afd3-43ddb0867654","Type":"ContainerStarted","Data":"16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8"} Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.450216 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-qhrq6"] Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.458834 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" podStartSLOduration=2.458813763 podStartE2EDuration="2.458813763s" podCreationTimestamp="2026-03-20 13:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:33.43874811 +0000 UTC m=+1408.319774240" watchObservedRunningTime="2026-03-20 13:46:33.458813763 +0000 UTC m=+1408.339839903" Mar 20 13:46:33 crc kubenswrapper[4856]: W0320 13:46:33.541986 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64c33778_140d_48df_89fa_ec1719ae6f2d.slice/crio-24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f WatchSource:0}: Error finding container 24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f: Status 
404 returned error can't find the container with id 24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.549211 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:33 crc kubenswrapper[4856]: W0320 13:46:33.571652 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc59c054_c214_4f61_b7dc_86d2f09b8b5e.slice/crio-637efcd3c75a4512bf1d9e5e0d1ef9bb3920b7ddfc6f8b957412d68c20096ffa WatchSource:0}: Error finding container 637efcd3c75a4512bf1d9e5e0d1ef9bb3920b7ddfc6f8b957412d68c20096ffa: Status 404 returned error can't find the container with id 637efcd3c75a4512bf1d9e5e0d1ef9bb3920b7ddfc6f8b957412d68c20096ffa Mar 20 13:46:33 crc kubenswrapper[4856]: I0320 13:46:33.861386 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eff5669-a6ea-461b-a0e6-4c1778a9e2c7" path="/var/lib/kubelet/pods/0eff5669-a6ea-461b-a0e6-4c1778a9e2c7/volumes" Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.457449 4856 generic.go:334] "Generic (PLEG): container finished" podID="2b8951ab-81c4-4f8f-8d45-f061c3a397da" containerID="f4db134aa1fb1676999455edc540061e8b9ce9c16ee9acfc003c7b44c61b255c" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.457537 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" event={"ID":"2b8951ab-81c4-4f8f-8d45-f061c3a397da","Type":"ContainerDied","Data":"f4db134aa1fb1676999455edc540061e8b9ce9c16ee9acfc003c7b44c61b255c"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.459327 4856 generic.go:334] "Generic (PLEG): container finished" podID="976fa187-ddb7-4116-8476-fb55efdbe660" containerID="c911729d76155581835fd96d5ba0fd73a56408242fdd2ebb71949a54c4f368f8" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.459366 4856 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" event={"ID":"976fa187-ddb7-4116-8476-fb55efdbe660","Type":"ContainerDied","Data":"c911729d76155581835fd96d5ba0fd73a56408242fdd2ebb71949a54c4f368f8"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.460923 4856 generic.go:334] "Generic (PLEG): container finished" podID="64c33778-140d-48df-89fa-ec1719ae6f2d" containerID="ea3e57786dd5d593272c7769d78753522ecbf8651c0d96f5c6236f1c41bd02ec" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.461006 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" event={"ID":"64c33778-140d-48df-89fa-ec1719ae6f2d","Type":"ContainerDied","Data":"ea3e57786dd5d593272c7769d78753522ecbf8651c0d96f5c6236f1c41bd02ec"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.461029 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" event={"ID":"64c33778-140d-48df-89fa-ec1719ae6f2d","Type":"ContainerStarted","Data":"24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.462149 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerStarted","Data":"637efcd3c75a4512bf1d9e5e0d1ef9bb3920b7ddfc6f8b957412d68c20096ffa"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.463759 4856 generic.go:334] "Generic (PLEG): container finished" podID="5ac312f2-a405-42c5-980c-2791676ef7e0" containerID="65ed668c045bd2fb920f6ef6b0e53b5aa27ea56d88a168862d0d47b73502ce27" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.463830 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-99lcm" 
event={"ID":"5ac312f2-a405-42c5-980c-2791676ef7e0","Type":"ContainerDied","Data":"65ed668c045bd2fb920f6ef6b0e53b5aa27ea56d88a168862d0d47b73502ce27"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.465317 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"82ca80b6-bf8a-4741-a5e0-059f20fae69b","Type":"ContainerStarted","Data":"b6968ce7333be8a128c62fc951855092f3670f6f1c40b19c0d3cebdfeb3e52d2"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.465565 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.466671 4856 generic.go:334] "Generic (PLEG): container finished" podID="94f62824-29e7-4d19-afd3-43ddb0867654" containerID="19e3e1f0478c0cdbf16f45c422cf145e176facf22ca8b861bee509403ecaf61f" exitCode=0 Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.466851 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv7js" event={"ID":"94f62824-29e7-4d19-afd3-43ddb0867654","Type":"ContainerDied","Data":"19e3e1f0478c0cdbf16f45c422cf145e176facf22ca8b861bee509403ecaf61f"} Mar 20 13:46:34 crc kubenswrapper[4856]: I0320 13:46:34.537348 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=3.10259663 podStartE2EDuration="3.537326333s" podCreationTimestamp="2026-03-20 13:46:31 +0000 UTC" firstStartedPulling="2026-03-20 13:46:32.841346198 +0000 UTC m=+1407.722372328" lastFinishedPulling="2026-03-20 13:46:33.276075901 +0000 UTC m=+1408.157102031" observedRunningTime="2026-03-20 13:46:34.528370147 +0000 UTC m=+1409.409396277" watchObservedRunningTime="2026-03-20 13:46:34.537326333 +0000 UTC m=+1409.418352483" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.004568 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.039630 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts\") pod \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.039804 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2dz\" (UniqueName: \"kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz\") pod \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\" (UID: \"885ead36-c4fe-42e5-8d15-95d1115cfcf4\") " Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.040259 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "885ead36-c4fe-42e5-8d15-95d1115cfcf4" (UID: "885ead36-c4fe-42e5-8d15-95d1115cfcf4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.045802 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz" (OuterVolumeSpecName: "kube-api-access-rk2dz") pod "885ead36-c4fe-42e5-8d15-95d1115cfcf4" (UID: "885ead36-c4fe-42e5-8d15-95d1115cfcf4"). InnerVolumeSpecName "kube-api-access-rk2dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.141734 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/885ead36-c4fe-42e5-8d15-95d1115cfcf4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.141774 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2dz\" (UniqueName: \"kubernetes.io/projected/885ead36-c4fe-42e5-8d15-95d1115cfcf4-kube-api-access-rk2dz\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.477125 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerStarted","Data":"da242efea8441fa92082673b2ef73f8d0d22d70066cb80c609ae34d05ef595cf"} Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.477498 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerStarted","Data":"efd6d46825ed6248aeeb346cdc60629b75b7ee308f46ca092d2d0880d00e924a"} Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.482534 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-pjrhc" Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.489304 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-pjrhc" event={"ID":"885ead36-c4fe-42e5-8d15-95d1115cfcf4","Type":"ContainerDied","Data":"6350db39862701ad10a22d0f9f8496dc864a8b06cd5b361f098a801ac247531b"} Mar 20 13:46:35 crc kubenswrapper[4856]: I0320 13:46:35.489639 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6350db39862701ad10a22d0f9f8496dc864a8b06cd5b361f098a801ac247531b" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.004637 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:36 crc kubenswrapper[4856]: E0320 13:46:36.049478 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc376281a_ae9c_4057_a9ac_1ef731747830.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.083539 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts\") pod \"5ac312f2-a405-42c5-980c-2791676ef7e0\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.083960 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sqqv\" (UniqueName: \"kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv\") pod \"5ac312f2-a405-42c5-980c-2791676ef7e0\" (UID: \"5ac312f2-a405-42c5-980c-2791676ef7e0\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.088835 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5ac312f2-a405-42c5-980c-2791676ef7e0" (UID: "5ac312f2-a405-42c5-980c-2791676ef7e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.093655 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv" (OuterVolumeSpecName: "kube-api-access-8sqqv") pod "5ac312f2-a405-42c5-980c-2791676ef7e0" (UID: "5ac312f2-a405-42c5-980c-2791676ef7e0"). InnerVolumeSpecName "kube-api-access-8sqqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.189209 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5ac312f2-a405-42c5-980c-2791676ef7e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.189256 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sqqv\" (UniqueName: \"kubernetes.io/projected/5ac312f2-a405-42c5-980c-2791676ef7e0-kube-api-access-8sqqv\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.225871 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.226144 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-log" containerID="cri-o://7297b820cc2bb8b7bd86556f1cb432b985e0eaab626721bdd32c58f8eec3968d" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.226715 4856 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-httpd" containerID="cri-o://64e0034feb09eb16bc21de5cb0df509b5c8da8bd04ba03e79ab00d49d565ba4e" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.266709 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.276315 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.290211 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts\") pod \"94f62824-29e7-4d19-afd3-43ddb0867654\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.290324 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvspp\" (UniqueName: \"kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp\") pod \"94f62824-29e7-4d19-afd3-43ddb0867654\" (UID: \"94f62824-29e7-4d19-afd3-43ddb0867654\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.291941 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "94f62824-29e7-4d19-afd3-43ddb0867654" (UID: "94f62824-29e7-4d19-afd3-43ddb0867654"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.297001 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp" (OuterVolumeSpecName: "kube-api-access-jvspp") pod "94f62824-29e7-4d19-afd3-43ddb0867654" (UID: "94f62824-29e7-4d19-afd3-43ddb0867654"). InnerVolumeSpecName "kube-api-access-jvspp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.393043 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84h9r\" (UniqueName: \"kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r\") pod \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.393804 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts\") pod \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\" (UID: \"2b8951ab-81c4-4f8f-8d45-f061c3a397da\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.394353 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b8951ab-81c4-4f8f-8d45-f061c3a397da" (UID: "2b8951ab-81c4-4f8f-8d45-f061c3a397da"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.396071 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/94f62824-29e7-4d19-afd3-43ddb0867654-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.396191 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b8951ab-81c4-4f8f-8d45-f061c3a397da-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.396422 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvspp\" (UniqueName: \"kubernetes.io/projected/94f62824-29e7-4d19-afd3-43ddb0867654-kube-api-access-jvspp\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.398594 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r" (OuterVolumeSpecName: "kube-api-access-84h9r") pod "2b8951ab-81c4-4f8f-8d45-f061c3a397da" (UID: "2b8951ab-81c4-4f8f-8d45-f061c3a397da"). InnerVolumeSpecName "kube-api-access-84h9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.463916 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.472923 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.495005 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerStarted","Data":"e64d0e63f1b0a032ec9e2063bfd1cf962fa446f017a55b8b251af9db7727138f"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.498204 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts\") pod \"64c33778-140d-48df-89fa-ec1719ae6f2d\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.498285 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5h26\" (UniqueName: \"kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26\") pod \"976fa187-ddb7-4116-8476-fb55efdbe660\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.498460 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzk6n\" (UniqueName: \"kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n\") pod \"64c33778-140d-48df-89fa-ec1719ae6f2d\" (UID: \"64c33778-140d-48df-89fa-ec1719ae6f2d\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.498488 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts\") pod \"976fa187-ddb7-4116-8476-fb55efdbe660\" (UID: \"976fa187-ddb7-4116-8476-fb55efdbe660\") " Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.499024 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84h9r\" (UniqueName: 
\"kubernetes.io/projected/2b8951ab-81c4-4f8f-8d45-f061c3a397da-kube-api-access-84h9r\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.499399 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "976fa187-ddb7-4116-8476-fb55efdbe660" (UID: "976fa187-ddb7-4116-8476-fb55efdbe660"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.499742 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64c33778-140d-48df-89fa-ec1719ae6f2d" (UID: "64c33778-140d-48df-89fa-ec1719ae6f2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.502687 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-99lcm" event={"ID":"5ac312f2-a405-42c5-980c-2791676ef7e0","Type":"ContainerDied","Data":"c1f0b030b7dca10cc1d7c3a61a01145db30e251a15c3d2fe77e767dc544868e2"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.502754 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1f0b030b7dca10cc1d7c3a61a01145db30e251a15c3d2fe77e767dc544868e2" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.502937 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-99lcm" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.506515 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bv7js" event={"ID":"94f62824-29e7-4d19-afd3-43ddb0867654","Type":"ContainerDied","Data":"16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.506651 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16555620ba2fa85a93eace59004ea59a92b178dbe55e39fe77d38e1ff21d02d8" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.506536 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n" (OuterVolumeSpecName: "kube-api-access-lzk6n") pod "64c33778-140d-48df-89fa-ec1719ae6f2d" (UID: "64c33778-140d-48df-89fa-ec1719ae6f2d"). InnerVolumeSpecName "kube-api-access-lzk6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.506811 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bv7js" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.508410 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26" (OuterVolumeSpecName: "kube-api-access-w5h26") pod "976fa187-ddb7-4116-8476-fb55efdbe660" (UID: "976fa187-ddb7-4116-8476-fb55efdbe660"). InnerVolumeSpecName "kube-api-access-w5h26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.511169 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" event={"ID":"2b8951ab-81c4-4f8f-8d45-f061c3a397da","Type":"ContainerDied","Data":"c45cfd0f9ca5cfe7eb1bc1fb69d7fcbb7ae2c62432b71b10490b34ddf86e3e7b"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.511300 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c45cfd0f9ca5cfe7eb1bc1fb69d7fcbb7ae2c62432b71b10490b34ddf86e3e7b" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.511405 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-wqlhb" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.522710 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" event={"ID":"976fa187-ddb7-4116-8476-fb55efdbe660","Type":"ContainerDied","Data":"4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.522757 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4edf5e7f39f2d07c411984d313385ebe08329a1d41419bef777da166a404a2b3" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.522945 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-wg8jz" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.525197 4856 generic.go:334] "Generic (PLEG): container finished" podID="d5e3128e-5cf2-432f-b268-090de59c9722" containerID="7297b820cc2bb8b7bd86556f1cb432b985e0eaab626721bdd32c58f8eec3968d" exitCode=143 Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.525242 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerDied","Data":"7297b820cc2bb8b7bd86556f1cb432b985e0eaab626721bdd32c58f8eec3968d"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.527066 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" event={"ID":"64c33778-140d-48df-89fa-ec1719ae6f2d","Type":"ContainerDied","Data":"24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f"} Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.527095 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24e988ece05b88d38dd11311a5bbd63ad324941efc4a37efe0678cfa2f04721f" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.527220 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-qhrq6" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.600590 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64c33778-140d-48df-89fa-ec1719ae6f2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.600612 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5h26\" (UniqueName: \"kubernetes.io/projected/976fa187-ddb7-4116-8476-fb55efdbe660-kube-api-access-w5h26\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.600622 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzk6n\" (UniqueName: \"kubernetes.io/projected/64c33778-140d-48df-89fa-ec1719ae6f2d-kube-api-access-lzk6n\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4856]: I0320 13:46:36.600632 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/976fa187-ddb7-4116-8476-fb55efdbe660-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:37 crc kubenswrapper[4856]: I0320 13:46:37.303179 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:37 crc kubenswrapper[4856]: I0320 13:46:37.303691 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-httpd" containerID="cri-o://1b4227016e0c8a4697775639aa23ccd7ad77dbae18ac0cc519c71be221f4e243" gracePeriod=30 Mar 20 13:46:37 crc kubenswrapper[4856]: I0320 13:46:37.305580 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-log" 
containerID="cri-o://72c22eeafe724ec37dd78b92a7210b42a72b436fec39dc56fb830bb3d35f4900" gracePeriod=30 Mar 20 13:46:37 crc kubenswrapper[4856]: I0320 13:46:37.539568 4856 generic.go:334] "Generic (PLEG): container finished" podID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerID="72c22eeafe724ec37dd78b92a7210b42a72b436fec39dc56fb830bb3d35f4900" exitCode=143 Mar 20 13:46:37 crc kubenswrapper[4856]: I0320 13:46:37.539647 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerDied","Data":"72c22eeafe724ec37dd78b92a7210b42a72b436fec39dc56fb830bb3d35f4900"} Mar 20 13:46:38 crc kubenswrapper[4856]: I0320 13:46:38.554210 4856 generic.go:334] "Generic (PLEG): container finished" podID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerID="d072b94b02f585dd39b2d4673d047f5f462e395f1e13b0a460be47718d48dcd0" exitCode=1 Mar 20 13:46:38 crc kubenswrapper[4856]: I0320 13:46:38.554549 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerDied","Data":"d072b94b02f585dd39b2d4673d047f5f462e395f1e13b0a460be47718d48dcd0"} Mar 20 13:46:38 crc kubenswrapper[4856]: I0320 13:46:38.554688 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-central-agent" containerID="cri-o://efd6d46825ed6248aeeb346cdc60629b75b7ee308f46ca092d2d0880d00e924a" gracePeriod=30 Mar 20 13:46:38 crc kubenswrapper[4856]: I0320 13:46:38.555598 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="sg-core" containerID="cri-o://e64d0e63f1b0a032ec9e2063bfd1cf962fa446f017a55b8b251af9db7727138f" gracePeriod=30 Mar 20 13:46:38 crc kubenswrapper[4856]: I0320 13:46:38.555600 4856 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-notification-agent" containerID="cri-o://da242efea8441fa92082673b2ef73f8d0d22d70066cb80c609ae34d05ef595cf" gracePeriod=30 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.571507 4856 generic.go:334] "Generic (PLEG): container finished" podID="d5e3128e-5cf2-432f-b268-090de59c9722" containerID="64e0034feb09eb16bc21de5cb0df509b5c8da8bd04ba03e79ab00d49d565ba4e" exitCode=0 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.571587 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerDied","Data":"64e0034feb09eb16bc21de5cb0df509b5c8da8bd04ba03e79ab00d49d565ba4e"} Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.575766 4856 generic.go:334] "Generic (PLEG): container finished" podID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerID="e64d0e63f1b0a032ec9e2063bfd1cf962fa446f017a55b8b251af9db7727138f" exitCode=2 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.575801 4856 generic.go:334] "Generic (PLEG): container finished" podID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerID="da242efea8441fa92082673b2ef73f8d0d22d70066cb80c609ae34d05ef595cf" exitCode=0 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.575798 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerDied","Data":"e64d0e63f1b0a032ec9e2063bfd1cf962fa446f017a55b8b251af9db7727138f"} Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.575855 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerDied","Data":"da242efea8441fa92082673b2ef73f8d0d22d70066cb80c609ae34d05ef595cf"} Mar 20 13:46:39 crc 
kubenswrapper[4856]: I0320 13:46:39.769038 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-58f96446cc-blkvz" Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.835235 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.835677 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67975c5cc6-2d96h" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-api" containerID="cri-o://e9d320f8afc636ad4d1326713391272c7b71185bfa68270e3d375053bc653bec" gracePeriod=30 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.835789 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-67975c5cc6-2d96h" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-httpd" containerID="cri-o://f1443a7d3d86f2f83dec5d6552b8e2ae4efe9881c4210ba9a130c8efc2674cfa" gracePeriod=30 Mar 20 13:46:39 crc kubenswrapper[4856]: I0320 13:46:39.952088 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.071659 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.071696 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.071720 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.071829 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.071879 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.072262 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs" (OuterVolumeSpecName: 
"logs") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.072361 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjwzj\" (UniqueName: \"kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.072390 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.072471 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs\") pod \"d5e3128e-5cf2-432f-b268-090de59c9722\" (UID: \"d5e3128e-5cf2-432f-b268-090de59c9722\") " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.072774 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.073129 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.073141 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d5e3128e-5cf2-432f-b268-090de59c9722-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.078509 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj" (OuterVolumeSpecName: "kube-api-access-cjwzj") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "kube-api-access-cjwzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.079400 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.081383 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts" (OuterVolumeSpecName: "scripts") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.108784 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.152796 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.152892 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data" (OuterVolumeSpecName: "config-data") pod "d5e3128e-5cf2-432f-b268-090de59c9722" (UID: "d5e3128e-5cf2-432f-b268-090de59c9722"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174753 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174794 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjwzj\" (UniqueName: \"kubernetes.io/projected/d5e3128e-5cf2-432f-b268-090de59c9722-kube-api-access-cjwzj\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174806 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174816 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174825 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.174832 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5e3128e-5cf2-432f-b268-090de59c9722-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.200406 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.276397 4856 reconciler_common.go:293] "Volume detached for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.601414 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d5e3128e-5cf2-432f-b268-090de59c9722","Type":"ContainerDied","Data":"d8e2c6c0ee5be608519d178007e0dc8baba4371a461fcd2ca99b298885be7198"} Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.601460 4856 scope.go:117] "RemoveContainer" containerID="64e0034feb09eb16bc21de5cb0df509b5c8da8bd04ba03e79ab00d49d565ba4e" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.601572 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.615212 4856 generic.go:334] "Generic (PLEG): container finished" podID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerID="f1443a7d3d86f2f83dec5d6552b8e2ae4efe9881c4210ba9a130c8efc2674cfa" exitCode=0 Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.615293 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerDied","Data":"f1443a7d3d86f2f83dec5d6552b8e2ae4efe9881c4210ba9a130c8efc2674cfa"} Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.617582 4856 generic.go:334] "Generic (PLEG): container finished" podID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerID="1b4227016e0c8a4697775639aa23ccd7ad77dbae18ac0cc519c71be221f4e243" exitCode=0 Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.617608 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerDied","Data":"1b4227016e0c8a4697775639aa23ccd7ad77dbae18ac0cc519c71be221f4e243"} Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 
13:46:40.648346 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.665030 4856 scope.go:117] "RemoveContainer" containerID="7297b820cc2bb8b7bd86556f1cb432b985e0eaab626721bdd32c58f8eec3968d" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.667480 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.677894 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681208 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac312f2-a405-42c5-980c-2791676ef7e0" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681234 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac312f2-a405-42c5-980c-2791676ef7e0" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681253 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64c33778-140d-48df-89fa-ec1719ae6f2d" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681263 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="64c33778-140d-48df-89fa-ec1719ae6f2d" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681303 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94f62824-29e7-4d19-afd3-43ddb0867654" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681312 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f62824-29e7-4d19-afd3-43ddb0867654" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681328 4856 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-httpd" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681336 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-httpd" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681351 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="885ead36-c4fe-42e5-8d15-95d1115cfcf4" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681358 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="885ead36-c4fe-42e5-8d15-95d1115cfcf4" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681379 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-log" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681387 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-log" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681406 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="976fa187-ddb7-4116-8476-fb55efdbe660" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681414 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="976fa187-ddb7-4116-8476-fb55efdbe660" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: E0320 13:46:40.681429 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b8951ab-81c4-4f8f-8d45-f061c3a397da" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681437 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b8951ab-81c4-4f8f-8d45-f061c3a397da" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681633 4856 
memory_manager.go:354] "RemoveStaleState removing state" podUID="885ead36-c4fe-42e5-8d15-95d1115cfcf4" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681652 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="976fa187-ddb7-4116-8476-fb55efdbe660" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681668 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="94f62824-29e7-4d19-afd3-43ddb0867654" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681683 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b8951ab-81c4-4f8f-8d45-f061c3a397da" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681694 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-httpd" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681705 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac312f2-a405-42c5-980c-2791676ef7e0" containerName="mariadb-database-create" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681722 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="64c33778-140d-48df-89fa-ec1719ae6f2d" containerName="mariadb-account-create-update" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.681736 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" containerName="glance-log" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.682889 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.685937 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.686114 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.750609 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.788867 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789201 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789230 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6skj\" (UniqueName: \"kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789273 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789322 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789340 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789383 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.789410 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.890963 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891036 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6skj\" (UniqueName: \"kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891094 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891130 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891154 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891197 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891228 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.891376 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.893242 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.893801 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.902523 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.906029 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.906039 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.906339 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.906955 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.924055 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6skj\" (UniqueName: \"kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 
13:46:40 crc kubenswrapper[4856]: I0320 13:46:40.950841 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " pod="openstack/glance-default-external-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.007930 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.012990 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.095381 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.095665 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.095709 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.095831 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.095848 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdbh6\" (UniqueName: \"kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.096703 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.096787 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.096822 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.097081 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\" (UID: \"4b717ce0-8fd1-454d-910d-d663dbc1b07a\") " Mar 20 13:46:41 crc 
kubenswrapper[4856]: I0320 13:46:41.097543 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.099341 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs" (OuterVolumeSpecName: "logs") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.102295 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts" (OuterVolumeSpecName: "scripts") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.115805 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6" (OuterVolumeSpecName: "kube-api-access-cdbh6") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "kube-api-access-cdbh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.115827 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.139671 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.169806 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.181714 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data" (OuterVolumeSpecName: "config-data") pod "4b717ce0-8fd1-454d-910d-d663dbc1b07a" (UID: "4b717ce0-8fd1-454d-910d-d663dbc1b07a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205397 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205462 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205485 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdbh6\" (UniqueName: \"kubernetes.io/projected/4b717ce0-8fd1-454d-910d-d663dbc1b07a-kube-api-access-cdbh6\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205495 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205506 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b717ce0-8fd1-454d-910d-d663dbc1b07a-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205546 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.205557 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b717ce0-8fd1-454d-910d-d663dbc1b07a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.228530 4856 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.307437 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.605790 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:46:41 crc kubenswrapper[4856]: W0320 13:46:41.612412 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb65360e6_90a7_4a8e_8647_6239e7c52e5b.slice/crio-3be30f93f31705feb401b66300fbc86df2ab10e4f07b8260acddf826bc38ec74 WatchSource:0}: Error finding container 3be30f93f31705feb401b66300fbc86df2ab10e4f07b8260acddf826bc38ec74: Status 404 returned error can't find the container with id 3be30f93f31705feb401b66300fbc86df2ab10e4f07b8260acddf826bc38ec74 Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.626368 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerStarted","Data":"3be30f93f31705feb401b66300fbc86df2ab10e4f07b8260acddf826bc38ec74"} Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.628339 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4b717ce0-8fd1-454d-910d-d663dbc1b07a","Type":"ContainerDied","Data":"e5b8e9ed9ba604ee12eef9d71346c30c2d9fcff73f370a3cd12d6102aab9a844"} Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.628394 4856 scope.go:117] "RemoveContainer" containerID="1b4227016e0c8a4697775639aa23ccd7ad77dbae18ac0cc519c71be221f4e243" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.628536 4856 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.666680 4856 scope.go:117] "RemoveContainer" containerID="72c22eeafe724ec37dd78b92a7210b42a72b436fec39dc56fb830bb3d35f4900" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.672543 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.691159 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.704273 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:41 crc kubenswrapper[4856]: E0320 13:46:41.706484 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-log" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.706499 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-log" Mar 20 13:46:41 crc kubenswrapper[4856]: E0320 13:46:41.706515 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-httpd" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.706520 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-httpd" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.706744 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-log" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.706771 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" containerName="glance-httpd" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.708598 4856 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.713245 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.713490 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.719345 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.815600 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.815684 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.815810 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.815873 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.816092 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.816175 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgr56\" (UniqueName: \"kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.816199 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.816298 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.841337 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4b717ce0-8fd1-454d-910d-d663dbc1b07a" path="/var/lib/kubelet/pods/4b717ce0-8fd1-454d-910d-d663dbc1b07a/volumes" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.842307 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e3128e-5cf2-432f-b268-090de59c9722" path="/var/lib/kubelet/pods/d5e3128e-5cf2-432f-b268-090de59c9722/volumes" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919155 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919210 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgr56\" (UniqueName: \"kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919237 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919312 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919352 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919425 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919451 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919479 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.919561 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.920245 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.920423 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.930334 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.933768 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.938919 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.938986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgr56\" (UniqueName: \"kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56\") pod 
\"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.944982 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:41 crc kubenswrapper[4856]: I0320 13:46:41.992858 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.040650 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.095505 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.502081 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dms79"] Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.503671 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.506718 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.506918 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qkrfk" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.506990 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.516394 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dms79"] Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.535248 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.535328 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.535389 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqk6\" (UniqueName: \"kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " 
pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.535464 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.637176 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.637235 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.637343 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqk6\" (UniqueName: \"kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.637477 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " 
pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.642766 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.644852 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.657781 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.661845 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqk6\" (UniqueName: \"kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6\") pod \"nova-cell0-conductor-db-sync-dms79\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.667755 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerStarted","Data":"eab1a972ced564d26d0363b491849a05a6f3e49a90ed55b28b5329a2b2fb593a"} Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.717483 4856 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:46:42 crc kubenswrapper[4856]: I0320 13:46:42.824178 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.425776 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dms79"] Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.694821 4856 generic.go:334] "Generic (PLEG): container finished" podID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerID="efd6d46825ed6248aeeb346cdc60629b75b7ee308f46ca092d2d0880d00e924a" exitCode=0 Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.695110 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerDied","Data":"efd6d46825ed6248aeeb346cdc60629b75b7ee308f46ca092d2d0880d00e924a"} Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.697045 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerStarted","Data":"32c113b6f7715bc1e4450f3f402b1f997eacbd1f27bf889a91c9380bda22c42d"} Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.697109 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerStarted","Data":"4875c22fe0a8b278ce840b2edb9f1d4e9e6d4ba220a514f94ed92626dd49247a"} Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.698845 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerStarted","Data":"2e30bbf7e5c9ce212c4664a13de9d24567d97e9650bc51aebc00393f15f7368c"} Mar 20 13:46:43 crc kubenswrapper[4856]: 
I0320 13:46:43.703993 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dms79" event={"ID":"131e451e-458e-4955-b559-b0eefb86cf25","Type":"ContainerStarted","Data":"4e42d47af8f7b3b048d5cb911fe8c57aff0818e405f79e953a6b265a7ac1d125"} Mar 20 13:46:43 crc kubenswrapper[4856]: I0320 13:46:43.764647 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.764627983 podStartE2EDuration="3.764627983s" podCreationTimestamp="2026-03-20 13:46:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:43.721626089 +0000 UTC m=+1418.602652239" watchObservedRunningTime="2026-03-20 13:46:43.764627983 +0000 UTC m=+1418.645654113" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.058469 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174343 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174434 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174506 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml\") pod 
\"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174524 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174545 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174597 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb29k\" (UniqueName: \"kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.174723 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd\") pod \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\" (UID: \"bc59c054-c214-4f61-b7dc-86d2f09b8b5e\") " Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.175799 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.176325 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.184015 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts" (OuterVolumeSpecName: "scripts") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.220441 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k" (OuterVolumeSpecName: "kube-api-access-xb29k") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "kube-api-access-xb29k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.237163 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.276889 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.277130 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.277227 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.277361 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb29k\" (UniqueName: \"kubernetes.io/projected/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-kube-api-access-xb29k\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.277452 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.285963 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.308509 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data" (OuterVolumeSpecName: "config-data") pod "bc59c054-c214-4f61-b7dc-86d2f09b8b5e" (UID: "bc59c054-c214-4f61-b7dc-86d2f09b8b5e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.379098 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.379805 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc59c054-c214-4f61-b7dc-86d2f09b8b5e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.714548 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.714516 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bc59c054-c214-4f61-b7dc-86d2f09b8b5e","Type":"ContainerDied","Data":"637efcd3c75a4512bf1d9e5e0d1ef9bb3920b7ddfc6f8b957412d68c20096ffa"} Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.714718 4856 scope.go:117] "RemoveContainer" containerID="d072b94b02f585dd39b2d4673d047f5f462e395f1e13b0a460be47718d48dcd0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.718305 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerStarted","Data":"5bb29030c4a50eae6ca1db03ef400e510392fd28217af8dc0f5c5c0444dfd46e"} Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.736101 4856 scope.go:117] "RemoveContainer" containerID="e64d0e63f1b0a032ec9e2063bfd1cf962fa446f017a55b8b251af9db7727138f" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.742213 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.742197204 podStartE2EDuration="3.742197204s" podCreationTimestamp="2026-03-20 13:46:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:44.741868395 +0000 UTC m=+1419.622894535" watchObservedRunningTime="2026-03-20 13:46:44.742197204 +0000 UTC m=+1419.623223334" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.764441 4856 scope.go:117] "RemoveContainer" containerID="da242efea8441fa92082673b2ef73f8d0d22d70066cb80c609ae34d05ef595cf" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.778113 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.803683 4856 
scope.go:117] "RemoveContainer" containerID="efd6d46825ed6248aeeb346cdc60629b75b7ee308f46ca092d2d0880d00e924a" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.826257 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.844549 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:44 crc kubenswrapper[4856]: E0320 13:46:44.844950 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-central-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.844966 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-central-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: E0320 13:46:44.844982 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="sg-core" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.844990 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="sg-core" Mar 20 13:46:44 crc kubenswrapper[4856]: E0320 13:46:44.845011 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-notification-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845017 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-notification-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: E0320 13:46:44.845030 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="proxy-httpd" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845036 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" 
containerName="proxy-httpd" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845194 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-central-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845210 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="proxy-httpd" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845224 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="sg-core" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.845238 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" containerName="ceilometer-notification-agent" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.847976 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.854922 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.859905 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.859983 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.859915 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989718 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data\") pod \"ceilometer-0\" (UID: 
\"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989761 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989781 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989804 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989821 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpj26\" (UniqueName: \"kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.989922 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: 
I0320 13:46:44.989979 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:44 crc kubenswrapper[4856]: I0320 13:46:44.990034 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091687 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091772 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091820 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091871 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091890 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091911 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091937 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.091958 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpj26\" (UniqueName: \"kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.096073 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 
13:46:45.097034 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.097898 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.099666 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.100130 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.100765 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.101659 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " 
pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.113578 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpj26\" (UniqueName: \"kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26\") pod \"ceilometer-0\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.216306 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.665342 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:45 crc kubenswrapper[4856]: W0320 13:46:45.675635 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4d9436f_69cb_4596_a7bb_5d8b627fb096.slice/crio-e66cf235e60cab3fcfe039e384cfb9a4e35fe19ef58358bf30bc20e655626650 WatchSource:0}: Error finding container e66cf235e60cab3fcfe039e384cfb9a4e35fe19ef58358bf30bc20e655626650: Status 404 returned error can't find the container with id e66cf235e60cab3fcfe039e384cfb9a4e35fe19ef58358bf30bc20e655626650 Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.730695 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerStarted","Data":"e66cf235e60cab3fcfe039e384cfb9a4e35fe19ef58358bf30bc20e655626650"} Mar 20 13:46:45 crc kubenswrapper[4856]: I0320 13:46:45.835384 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc59c054-c214-4f61-b7dc-86d2f09b8b5e" path="/var/lib/kubelet/pods/bc59c054-c214-4f61-b7dc-86d2f09b8b5e/volumes" Mar 20 13:46:46 crc kubenswrapper[4856]: E0320 13:46:46.375487 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc376281a_ae9c_4057_a9ac_1ef731747830.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:46:47 crc kubenswrapper[4856]: I0320 13:46:47.754946 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerStarted","Data":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} Mar 20 13:46:47 crc kubenswrapper[4856]: I0320 13:46:47.758035 4856 generic.go:334] "Generic (PLEG): container finished" podID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerID="e9d320f8afc636ad4d1326713391272c7b71185bfa68270e3d375053bc653bec" exitCode=0 Mar 20 13:46:47 crc kubenswrapper[4856]: I0320 13:46:47.758077 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerDied","Data":"e9d320f8afc636ad4d1326713391272c7b71185bfa68270e3d375053bc653bec"} Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 13:46:51.013160 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 13:46:51.013775 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 13:46:51.044696 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 13:46:51.053552 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 13:46:51.802096 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:46:51 crc kubenswrapper[4856]: I0320 
13:46:51.802150 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.041102 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.041522 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.076922 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.085670 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.428764 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.541164 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jv2x\" (UniqueName: \"kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x\") pod \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.541231 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config\") pod \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.541413 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs\") pod \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.541446 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle\") pod \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.541463 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config\") pod \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\" (UID: \"c2782efe-c7dc-4301-a897-cfe6a08aa7fb\") " Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.550785 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "c2782efe-c7dc-4301-a897-cfe6a08aa7fb" (UID: "c2782efe-c7dc-4301-a897-cfe6a08aa7fb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.553889 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x" (OuterVolumeSpecName: "kube-api-access-8jv2x") pod "c2782efe-c7dc-4301-a897-cfe6a08aa7fb" (UID: "c2782efe-c7dc-4301-a897-cfe6a08aa7fb"). InnerVolumeSpecName "kube-api-access-8jv2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.594209 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c2782efe-c7dc-4301-a897-cfe6a08aa7fb" (UID: "c2782efe-c7dc-4301-a897-cfe6a08aa7fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.601477 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config" (OuterVolumeSpecName: "config") pod "c2782efe-c7dc-4301-a897-cfe6a08aa7fb" (UID: "c2782efe-c7dc-4301-a897-cfe6a08aa7fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.615336 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "c2782efe-c7dc-4301-a897-cfe6a08aa7fb" (UID: "c2782efe-c7dc-4301-a897-cfe6a08aa7fb"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.653113 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jv2x\" (UniqueName: \"kubernetes.io/projected/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-kube-api-access-8jv2x\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.653180 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.653197 4856 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.653215 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.653231 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2782efe-c7dc-4301-a897-cfe6a08aa7fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.814731 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67975c5cc6-2d96h" event={"ID":"c2782efe-c7dc-4301-a897-cfe6a08aa7fb","Type":"ContainerDied","Data":"303e8852dc2ce0e339ae1a0ccf198eb9a0fbaa193fa851df8183fa15807c76c3"} Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.814793 4856 scope.go:117] "RemoveContainer" containerID="f1443a7d3d86f2f83dec5d6552b8e2ae4efe9881c4210ba9a130c8efc2674cfa" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.814972 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67975c5cc6-2d96h" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.816667 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.816707 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.855975 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.864160 4856 scope.go:117] "RemoveContainer" containerID="e9d320f8afc636ad4d1326713391272c7b71185bfa68270e3d375053bc653bec" Mar 20 13:46:52 crc kubenswrapper[4856]: I0320 13:46:52.865405 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-67975c5cc6-2d96h"] Mar 20 13:46:53 crc kubenswrapper[4856]: I0320 13:46:53.841330 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" path="/var/lib/kubelet/pods/c2782efe-c7dc-4301-a897-cfe6a08aa7fb/volumes" Mar 20 13:46:53 crc kubenswrapper[4856]: I0320 13:46:53.856055 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dms79" event={"ID":"131e451e-458e-4955-b559-b0eefb86cf25","Type":"ContainerStarted","Data":"44ee3199e8ef2be02919b100ea3ac3eb65af208ff9499ecb07ccb5204ef2e122"} Mar 20 13:46:53 crc kubenswrapper[4856]: I0320 13:46:53.859713 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerStarted","Data":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} Mar 20 13:46:53 crc kubenswrapper[4856]: I0320 13:46:53.859746 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerStarted","Data":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} Mar 20 13:46:53 crc kubenswrapper[4856]: I0320 13:46:53.893310 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dms79" podStartSLOduration=2.502012615 podStartE2EDuration="11.893287505s" podCreationTimestamp="2026-03-20 13:46:42 +0000 UTC" firstStartedPulling="2026-03-20 13:46:43.454807751 +0000 UTC m=+1418.335833881" lastFinishedPulling="2026-03-20 13:46:52.846082641 +0000 UTC m=+1427.727108771" observedRunningTime="2026-03-20 13:46:53.882086937 +0000 UTC m=+1428.763113067" watchObservedRunningTime="2026-03-20 13:46:53.893287505 +0000 UTC m=+1428.774313655" Mar 20 13:46:54 crc kubenswrapper[4856]: I0320 13:46:54.060482 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:54 crc kubenswrapper[4856]: I0320 13:46:54.061080 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:46:54 crc kubenswrapper[4856]: I0320 13:46:54.068193 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.183912 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.184036 4856 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.292943 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.877530 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerStarted","Data":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.878091 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:55 crc kubenswrapper[4856]: I0320 13:46:55.901217 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2189960380000002 podStartE2EDuration="11.901195742s" podCreationTimestamp="2026-03-20 13:46:44 +0000 UTC" firstStartedPulling="2026-03-20 13:46:45.679586488 +0000 UTC m=+1420.560612618" lastFinishedPulling="2026-03-20 13:46:55.361786192 +0000 UTC m=+1430.242812322" observedRunningTime="2026-03-20 13:46:55.899555786 +0000 UTC m=+1430.780581926" watchObservedRunningTime="2026-03-20 13:46:55.901195742 +0000 UTC m=+1430.782221872" Mar 20 13:46:56 crc kubenswrapper[4856]: E0320 13:46:56.603373 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc376281a_ae9c_4057_a9ac_1ef731747830.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:46:57 crc kubenswrapper[4856]: I0320 13:46:57.638627 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:57 crc kubenswrapper[4856]: I0320 13:46:57.907854 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-central-agent" containerID="cri-o://1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4856]: I0320 13:46:57.908441 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" 
containerName="proxy-httpd" containerID="cri-o://a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4856]: I0320 13:46:57.908501 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="sg-core" containerID="cri-o://45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4856]: I0320 13:46:57.908546 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-notification-agent" containerID="cri-o://7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" gracePeriod=30 Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.648988 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775085 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775184 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" 
(UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775254 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775476 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpj26\" (UniqueName: \"kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775536 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775586 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.775605 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data\") pod \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\" (UID: \"d4d9436f-69cb-4596-a7bb-5d8b627fb096\") " Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.776452 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.776507 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.781040 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26" (OuterVolumeSpecName: "kube-api-access-kpj26") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "kube-api-access-kpj26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.781450 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts" (OuterVolumeSpecName: "scripts") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.806394 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.830068 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.868713 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878106 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878379 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878457 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d4d9436f-69cb-4596-a7bb-5d8b627fb096-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878538 4856 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-ceilometer-tls-certs\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878620 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpj26\" (UniqueName: \"kubernetes.io/projected/d4d9436f-69cb-4596-a7bb-5d8b627fb096-kube-api-access-kpj26\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878683 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.878745 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.885531 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data" (OuterVolumeSpecName: "config-data") pod "d4d9436f-69cb-4596-a7bb-5d8b627fb096" (UID: "d4d9436f-69cb-4596-a7bb-5d8b627fb096"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920373 4856 generic.go:334] "Generic (PLEG): container finished" podID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" exitCode=0 Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920420 4856 generic.go:334] "Generic (PLEG): container finished" podID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" exitCode=2 Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920427 4856 generic.go:334] "Generic (PLEG): container finished" podID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" exitCode=0 Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920434 4856 generic.go:334] "Generic (PLEG): container finished" podID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" exitCode=0 Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920451 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerDied","Data":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920476 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerDied","Data":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920502 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerDied","Data":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} Mar 20 13:46:58 crc 
kubenswrapper[4856]: I0320 13:46:58.920511 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerDied","Data":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920519 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d4d9436f-69cb-4596-a7bb-5d8b627fb096","Type":"ContainerDied","Data":"e66cf235e60cab3fcfe039e384cfb9a4e35fe19ef58358bf30bc20e655626650"} Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920535 4856 scope.go:117] "RemoveContainer" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.920682 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.955643 4856 scope.go:117] "RemoveContainer" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.968586 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.979503 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.980746 4856 scope.go:117] "RemoveContainer" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.980810 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d9436f-69cb-4596-a7bb-5d8b627fb096-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992342 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 
crc kubenswrapper[4856]: E0320 13:46:58.992810 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992831 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: E0320 13:46:58.992848 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-api" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992857 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-api" Mar 20 13:46:58 crc kubenswrapper[4856]: E0320 13:46:58.992876 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-central-agent" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992883 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-central-agent" Mar 20 13:46:58 crc kubenswrapper[4856]: E0320 13:46:58.992910 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="proxy-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992918 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="proxy-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: E0320 13:46:58.992937 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-notification-agent" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992945 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-notification-agent" Mar 
20 13:46:58 crc kubenswrapper[4856]: E0320 13:46:58.992964 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="sg-core" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.992972 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="sg-core" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993204 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-api" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993220 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="proxy-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993232 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-central-agent" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993241 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="ceilometer-notification-agent" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993297 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2782efe-c7dc-4301-a897-cfe6a08aa7fb" containerName="neutron-httpd" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.993307 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" containerName="sg-core" Mar 20 13:46:58 crc kubenswrapper[4856]: I0320 13:46:58.995216 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.002061 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.002358 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.002550 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.016612 4856 scope.go:117] "RemoveContainer" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.022349 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.082502 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.082711 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.082829 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96zx\" (UniqueName: \"kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " 
pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.082913 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.083138 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.083234 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.083314 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.083389 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.103562 4856 scope.go:117] "RemoveContainer" 
containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:59 crc kubenswrapper[4856]: E0320 13:46:59.104005 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": container with ID starting with a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913 not found: ID does not exist" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.104051 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} err="failed to get container status \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": rpc error: code = NotFound desc = could not find container \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": container with ID starting with a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.104078 4856 scope.go:117] "RemoveContainer" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:59 crc kubenswrapper[4856]: E0320 13:46:59.104465 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": container with ID starting with 45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c not found: ID does not exist" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.104511 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} err="failed to get container status \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": rpc error: code = NotFound desc = could not find container \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": container with ID starting with 45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.104541 4856 scope.go:117] "RemoveContainer" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:59 crc kubenswrapper[4856]: E0320 13:46:59.105233 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": container with ID starting with 7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412 not found: ID does not exist" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105297 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} err="failed to get container status \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": rpc error: code = NotFound desc = could not find container \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": container with ID starting with 7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105316 4856 scope.go:117] "RemoveContainer" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: E0320 13:46:59.105547 4856 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": container with ID starting with 1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e not found: ID does not exist" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105569 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} err="failed to get container status \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": rpc error: code = NotFound desc = could not find container \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": container with ID starting with 1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105582 4856 scope.go:117] "RemoveContainer" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105877 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} err="failed to get container status \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": rpc error: code = NotFound desc = could not find container \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": container with ID starting with a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.105917 4856 scope.go:117] "RemoveContainer" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.106174 4856 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} err="failed to get container status \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": rpc error: code = NotFound desc = could not find container \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": container with ID starting with 45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.106198 4856 scope.go:117] "RemoveContainer" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.106649 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} err="failed to get container status \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": rpc error: code = NotFound desc = could not find container \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": container with ID starting with 7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.106674 4856 scope.go:117] "RemoveContainer" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107054 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} err="failed to get container status \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": rpc error: code = NotFound desc = could not find container \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": container with ID starting with 
1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107079 4856 scope.go:117] "RemoveContainer" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107415 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} err="failed to get container status \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": rpc error: code = NotFound desc = could not find container \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": container with ID starting with a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107463 4856 scope.go:117] "RemoveContainer" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107711 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} err="failed to get container status \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": rpc error: code = NotFound desc = could not find container \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": container with ID starting with 45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107731 4856 scope.go:117] "RemoveContainer" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107969 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} err="failed to get container status \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": rpc error: code = NotFound desc = could not find container \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": container with ID starting with 7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.107992 4856 scope.go:117] "RemoveContainer" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.108205 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} err="failed to get container status \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": rpc error: code = NotFound desc = could not find container \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": container with ID starting with 1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.108246 4856 scope.go:117] "RemoveContainer" containerID="a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.108558 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913"} err="failed to get container status \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": rpc error: code = NotFound desc = could not find container \"a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913\": container with ID starting with a1b562ccf3e8e22d41757dc4c2011d1b66d34b1fc9482a9eea759fc407ad7913 not found: ID does not 
exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.108583 4856 scope.go:117] "RemoveContainer" containerID="45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.109727 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c"} err="failed to get container status \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": rpc error: code = NotFound desc = could not find container \"45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c\": container with ID starting with 45bc9187705e2262c913ecc3eba313d89a72e5224b3ae51e5b1b5b00632b599c not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.109754 4856 scope.go:117] "RemoveContainer" containerID="7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.110058 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412"} err="failed to get container status \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": rpc error: code = NotFound desc = could not find container \"7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412\": container with ID starting with 7e3df463e31df7a540e9450808f125e9f50870b48a008a24257ed02f9253b412 not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.110077 4856 scope.go:117] "RemoveContainer" containerID="1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.110517 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e"} err="failed to get container status 
\"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": rpc error: code = NotFound desc = could not find container \"1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e\": container with ID starting with 1a23ce8f2e2524d71a8b7ff4588c38363bdbc9a59f67a28955916a1edf9dc53e not found: ID does not exist" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.184934 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185015 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185079 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185103 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185144 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x96zx\" (UniqueName: 
\"kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185177 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185254 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185403 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185525 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.185983 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.189951 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.190102 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.190616 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.190660 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.192724 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") " pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.206286 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96zx\" (UniqueName: \"kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx\") pod \"ceilometer-0\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") 
" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.391726 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.831204 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4d9436f-69cb-4596-a7bb-5d8b627fb096" path="/var/lib/kubelet/pods/d4d9436f-69cb-4596-a7bb-5d8b627fb096/volumes" Mar 20 13:46:59 crc kubenswrapper[4856]: W0320 13:46:59.858233 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c5ebb05_0f0f_46ba_b5a0_f225973b4e03.slice/crio-62003e4a049c4400fda9b02f1372627e59526763c170c75405b9b0d26892927d WatchSource:0}: Error finding container 62003e4a049c4400fda9b02f1372627e59526763c170c75405b9b0d26892927d: Status 404 returned error can't find the container with id 62003e4a049c4400fda9b02f1372627e59526763c170c75405b9b0d26892927d Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.859059 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:59 crc kubenswrapper[4856]: I0320 13:46:59.930329 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerStarted","Data":"62003e4a049c4400fda9b02f1372627e59526763c170c75405b9b0d26892927d"} Mar 20 13:47:00 crc kubenswrapper[4856]: I0320 13:47:00.940598 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerStarted","Data":"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de"} Mar 20 13:47:01 crc kubenswrapper[4856]: I0320 13:47:01.953967 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerStarted","Data":"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e"} Mar 20 13:47:02 crc kubenswrapper[4856]: I0320 13:47:02.965769 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerStarted","Data":"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778"} Mar 20 13:47:05 crc kubenswrapper[4856]: I0320 13:47:05.999126 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerStarted","Data":"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758"} Mar 20 13:47:06 crc kubenswrapper[4856]: I0320 13:47:05.999764 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:47:06 crc kubenswrapper[4856]: I0320 13:47:06.024131 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.648361874 podStartE2EDuration="8.02411473s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.860686041 +0000 UTC m=+1434.741712171" lastFinishedPulling="2026-03-20 13:47:05.236438897 +0000 UTC m=+1440.117465027" observedRunningTime="2026-03-20 13:47:06.020453365 +0000 UTC m=+1440.901479505" watchObservedRunningTime="2026-03-20 13:47:06.02411473 +0000 UTC m=+1440.905140860" Mar 20 13:47:16 crc kubenswrapper[4856]: I0320 13:47:16.092408 4856 generic.go:334] "Generic (PLEG): container finished" podID="131e451e-458e-4955-b559-b0eefb86cf25" containerID="44ee3199e8ef2be02919b100ea3ac3eb65af208ff9499ecb07ccb5204ef2e122" exitCode=0 Mar 20 13:47:16 crc kubenswrapper[4856]: I0320 13:47:16.092444 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dms79" 
event={"ID":"131e451e-458e-4955-b559-b0eefb86cf25","Type":"ContainerDied","Data":"44ee3199e8ef2be02919b100ea3ac3eb65af208ff9499ecb07ccb5204ef2e122"} Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.541481 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.678182 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle\") pod \"131e451e-458e-4955-b559-b0eefb86cf25\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.678326 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts\") pod \"131e451e-458e-4955-b559-b0eefb86cf25\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.678400 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wqk6\" (UniqueName: \"kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6\") pod \"131e451e-458e-4955-b559-b0eefb86cf25\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.678528 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data\") pod \"131e451e-458e-4955-b559-b0eefb86cf25\" (UID: \"131e451e-458e-4955-b559-b0eefb86cf25\") " Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.683789 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts" (OuterVolumeSpecName: "scripts") pod 
"131e451e-458e-4955-b559-b0eefb86cf25" (UID: "131e451e-458e-4955-b559-b0eefb86cf25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.684559 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6" (OuterVolumeSpecName: "kube-api-access-8wqk6") pod "131e451e-458e-4955-b559-b0eefb86cf25" (UID: "131e451e-458e-4955-b559-b0eefb86cf25"). InnerVolumeSpecName "kube-api-access-8wqk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.708923 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data" (OuterVolumeSpecName: "config-data") pod "131e451e-458e-4955-b559-b0eefb86cf25" (UID: "131e451e-458e-4955-b559-b0eefb86cf25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.721685 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "131e451e-458e-4955-b559-b0eefb86cf25" (UID: "131e451e-458e-4955-b559-b0eefb86cf25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.781558 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.781614 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.781635 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wqk6\" (UniqueName: \"kubernetes.io/projected/131e451e-458e-4955-b559-b0eefb86cf25-kube-api-access-8wqk6\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:17 crc kubenswrapper[4856]: I0320 13:47:17.781655 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/131e451e-458e-4955-b559-b0eefb86cf25-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.112177 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dms79" event={"ID":"131e451e-458e-4955-b559-b0eefb86cf25","Type":"ContainerDied","Data":"4e42d47af8f7b3b048d5cb911fe8c57aff0818e405f79e953a6b265a7ac1d125"} Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.112242 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e42d47af8f7b3b048d5cb911fe8c57aff0818e405f79e953a6b265a7ac1d125" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.112314 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dms79" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.296058 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:18 crc kubenswrapper[4856]: E0320 13:47:18.296708 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131e451e-458e-4955-b559-b0eefb86cf25" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.296744 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="131e451e-458e-4955-b559-b0eefb86cf25" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.297063 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="131e451e-458e-4955-b559-b0eefb86cf25" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.297998 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.299888 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.300585 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qkrfk" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.314587 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.393258 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbk4\" (UniqueName: \"kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc 
kubenswrapper[4856]: I0320 13:47:18.393362 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.393491 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.494708 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.494947 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.495032 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbk4\" (UniqueName: \"kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.500454 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.501686 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.518875 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbk4\" (UniqueName: \"kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4\") pod \"nova-cell0-conductor-0\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:18 crc kubenswrapper[4856]: I0320 13:47:18.617243 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:19 crc kubenswrapper[4856]: I0320 13:47:19.051256 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:19 crc kubenswrapper[4856]: I0320 13:47:19.122728 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b314fa97-2e86-46ef-8034-97bb179a3139","Type":"ContainerStarted","Data":"4c9871dd58a8c80f84c25c7883deecb0e49b93852b1e1481ed3302180e7ffe73"} Mar 20 13:47:20 crc kubenswrapper[4856]: I0320 13:47:20.134455 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b314fa97-2e86-46ef-8034-97bb179a3139","Type":"ContainerStarted","Data":"6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f"} Mar 20 13:47:20 crc kubenswrapper[4856]: I0320 13:47:20.134796 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:20 crc kubenswrapper[4856]: I0320 13:47:20.165619 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.165598009 podStartE2EDuration="2.165598009s" podCreationTimestamp="2026-03-20 13:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:20.154514735 +0000 UTC m=+1455.035540875" watchObservedRunningTime="2026-03-20 13:47:20.165598009 +0000 UTC m=+1455.046624139" Mar 20 13:47:28 crc kubenswrapper[4856]: I0320 13:47:28.654084 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.167579 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qm47c"] Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.168855 4856 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.171108 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.172509 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.184674 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qm47c"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.294550 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.294844 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwg8\" (UniqueName: \"kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.295031 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.295110 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.313882 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.314914 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.322471 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.329548 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.396797 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397031 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx5dl\" (UniqueName: \"kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397101 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwg8\" (UniqueName: \"kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397224 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397284 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397316 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.397358 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.406002 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.417093 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.448497 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwg8\" (UniqueName: \"kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.449164 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qm47c\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.501810 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.501895 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.502080 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx5dl\" (UniqueName: \"kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.505417 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.522295 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.533002 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qm47c"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.544919 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.554056 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.554635 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.568785 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.585262 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.585897 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx5dl\" (UniqueName: \"kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl\") pod \"nova-cell1-novncproxy-0\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.586751 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.602724 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.604350 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.604395 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.604457 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4xt\" (UniqueName: \"kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.604584 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.630728 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.634717 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.660973 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.691518 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.695876 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.705889 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706221 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706315 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rncs\" (UniqueName: \"kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706375 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706396 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706443 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706476 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.706495 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4xt\" (UniqueName: \"kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.707566 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.717629 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.719409 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.721497 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.722558 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.728729 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.736799 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4xt\" (UniqueName: \"kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt\") pod \"nova-metadata-0\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.739464 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.773573 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808193 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808234 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2pll\" (UniqueName: \"kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808254 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808325 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808340 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmhgj\" (UniqueName: \"kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808363 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rncs\" (UniqueName: \"kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808421 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808504 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808540 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808646 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808679 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808767 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.808793 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.812055 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.812701 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.819840 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.837927 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rncs\" (UniqueName: \"kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs\") pod \"nova-api-0\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " pod="openstack/nova-api-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.910664 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911065 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911100 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pll\" (UniqueName: \"kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911122 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911252 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmhgj\" (UniqueName: \"kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911288 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911416 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911486 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.911541 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.914836 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.915486 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.920163 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.921304 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.922064 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.935831 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.940545 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pll\" (UniqueName: \"kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.943400 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmhgj\" (UniqueName: \"kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj\") pod \"dnsmasq-dns-757b4f8459-z6d2g\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.946863 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " pod="openstack/nova-scheduler-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.987984 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.990586 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qm47c"]
Mar 20 13:47:29 crc kubenswrapper[4856]: I0320 13:47:29.997778 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.024066 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.057711 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.274122 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qm47c" event={"ID":"f5f44803-700d-498a-819c-881f5959b477","Type":"ContainerStarted","Data":"34743ab6cb2152fe89644cb800763f0838b36549a158ebf04f53460c1b9f6a0a"}
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.326409 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.370596 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2ngth"]
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.371965 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.374808 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.375154 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.381651 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2ngth"]
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.427037 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkbks\" (UniqueName: \"kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.427243 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.427361 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.427666 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.531038 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkbks\" (UniqueName: \"kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.531287 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.531312 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.531383 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.538977 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.540783 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.547139 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.568845 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkbks\" (UniqueName: \"kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks\") pod \"nova-cell1-conductor-db-sync-2ngth\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " pod="openstack/nova-cell1-conductor-db-sync-2ngth"
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.601674 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:47:30 crc kubenswrapper[4856]: W0320 13:47:30.619517 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c2bb138_cdb6_4faa_abf4_3b3a9db0cecb.slice/crio-a48744cf62921358f594eb47bf333396fbc94a3764057d22ebc65c23c15de337 WatchSource:0}: Error finding container a48744cf62921358f594eb47bf333396fbc94a3764057d22ebc65c23c15de337: Status 404 returned error can't find the container with id a48744cf62921358f594eb47bf333396fbc94a3764057d22ebc65c23c15de337
Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.692920 4856 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2ngth" Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.864920 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.935422 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"] Mar 20 13:47:30 crc kubenswrapper[4856]: I0320 13:47:30.949494 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:30 crc kubenswrapper[4856]: W0320 13:47:30.966979 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d3e255b_faee_4024_96dd_6e10ef862fb6.slice/crio-04761d07656a616fa7d0a9e1a46514d2d6df1de62a9b553703a38bf81c19988a WatchSource:0}: Error finding container 04761d07656a616fa7d0a9e1a46514d2d6df1de62a9b553703a38bf81c19988a: Status 404 returned error can't find the container with id 04761d07656a616fa7d0a9e1a46514d2d6df1de62a9b553703a38bf81c19988a Mar 20 13:47:30 crc kubenswrapper[4856]: W0320 13:47:30.988083 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod204d3eaf_06d1_4c05_a2a9_01d424229125.slice/crio-0920b89e71ad359a91c5bf5dfe6a6188d533a141b05a0c055d3bd315d49b6798 WatchSource:0}: Error finding container 0920b89e71ad359a91c5bf5dfe6a6188d533a141b05a0c055d3bd315d49b6798: Status 404 returned error can't find the container with id 0920b89e71ad359a91c5bf5dfe6a6188d533a141b05a0c055d3bd315d49b6798 Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.285319 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerStarted","Data":"a48744cf62921358f594eb47bf333396fbc94a3764057d22ebc65c23c15de337"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.303916 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qm47c" event={"ID":"f5f44803-700d-498a-819c-881f5959b477","Type":"ContainerStarted","Data":"77e3580170a69870b08acfefe46b48cd21070ff07cffab50cec1275d006beea4"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.308509 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerStarted","Data":"2f8e3e143c8496da6ccb2163dce114265bd091e65ac679ba41d513d9fed745ae"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.314432 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" event={"ID":"204d3eaf-06d1-4c05-a2a9-01d424229125","Type":"ContainerStarted","Data":"0920b89e71ad359a91c5bf5dfe6a6188d533a141b05a0c055d3bd315d49b6798"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.318071 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb9ec39c-3fd5-4477-9868-44e5424f9bb3","Type":"ContainerStarted","Data":"c500d3df8081746eaa449528a230fd690e9fb4c5c929194cc7f481f7c8f43e57"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.321790 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qm47c" podStartSLOduration=2.321620501 podStartE2EDuration="2.321620501s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:31.320337046 +0000 UTC m=+1466.201363186" watchObservedRunningTime="2026-03-20 13:47:31.321620501 +0000 UTC m=+1466.202646631" Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.324523 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"3d3e255b-faee-4024-96dd-6e10ef862fb6","Type":"ContainerStarted","Data":"04761d07656a616fa7d0a9e1a46514d2d6df1de62a9b553703a38bf81c19988a"} Mar 20 13:47:31 crc kubenswrapper[4856]: I0320 13:47:31.437944 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2ngth"] Mar 20 13:47:32 crc kubenswrapper[4856]: I0320 13:47:32.336647 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2ngth" event={"ID":"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0","Type":"ContainerStarted","Data":"7c030a52fa49c1ad27d857f75abf652057f78f1ef66883ba32b85866d7b6baf7"} Mar 20 13:47:32 crc kubenswrapper[4856]: I0320 13:47:32.337018 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2ngth" event={"ID":"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0","Type":"ContainerStarted","Data":"0ffc7a1d0d72910ed0d475ce38a19892a17d5c9cf6ad09cdcebdcb5ba828fd85"} Mar 20 13:47:32 crc kubenswrapper[4856]: I0320 13:47:32.339293 4856 generic.go:334] "Generic (PLEG): container finished" podID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerID="834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8" exitCode=0 Mar 20 13:47:32 crc kubenswrapper[4856]: I0320 13:47:32.339390 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" event={"ID":"204d3eaf-06d1-4c05-a2a9-01d424229125","Type":"ContainerDied","Data":"834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8"} Mar 20 13:47:32 crc kubenswrapper[4856]: I0320 13:47:32.356038 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2ngth" podStartSLOduration=2.356015122 podStartE2EDuration="2.356015122s" podCreationTimestamp="2026-03-20 13:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:32.352191693 +0000 
UTC m=+1467.233217823" watchObservedRunningTime="2026-03-20 13:47:32.356015122 +0000 UTC m=+1467.237041252" Mar 20 13:47:33 crc kubenswrapper[4856]: I0320 13:47:33.590985 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:47:33 crc kubenswrapper[4856]: I0320 13:47:33.601258 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.366539 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerStarted","Data":"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.368101 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerStarted","Data":"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.369504 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" event={"ID":"204d3eaf-06d1-4c05-a2a9-01d424229125","Type":"ContainerStarted","Data":"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.369626 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.371629 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb9ec39c-3fd5-4477-9868-44e5424f9bb3","Type":"ContainerStarted","Data":"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.371693 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" 
podUID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169" gracePeriod=30 Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.377698 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d3e255b-faee-4024-96dd-6e10ef862fb6","Type":"ContainerStarted","Data":"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.379440 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerStarted","Data":"dd616a65806e32ecf6ca977b0fd2d7643d582d6c8e1e034aa71600f4b05c27ff"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.379478 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerStarted","Data":"b8a533331e86e9b40051e2ce798be835d7717ec2238a43974bf44c29d23448b3"} Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.379583 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-log" containerID="cri-o://b8a533331e86e9b40051e2ce798be835d7717ec2238a43974bf44c29d23448b3" gracePeriod=30 Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.379591 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-metadata" containerID="cri-o://dd616a65806e32ecf6ca977b0fd2d7643d582d6c8e1e034aa71600f4b05c27ff" gracePeriod=30 Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.389502 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" 
podStartSLOduration=2.763425876 podStartE2EDuration="6.38947759s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="2026-03-20 13:47:30.901245153 +0000 UTC m=+1465.782271283" lastFinishedPulling="2026-03-20 13:47:34.527296867 +0000 UTC m=+1469.408322997" observedRunningTime="2026-03-20 13:47:35.382252976 +0000 UTC m=+1470.263279126" watchObservedRunningTime="2026-03-20 13:47:35.38947759 +0000 UTC m=+1470.270503720" Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.429646 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" podStartSLOduration=6.429624117 podStartE2EDuration="6.429624117s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:35.423211926 +0000 UTC m=+1470.304238066" watchObservedRunningTime="2026-03-20 13:47:35.429624117 +0000 UTC m=+1470.310650267" Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.446078 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5486726429999997 podStartE2EDuration="6.446058433s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="2026-03-20 13:47:30.626630255 +0000 UTC m=+1465.507656385" lastFinishedPulling="2026-03-20 13:47:34.524016045 +0000 UTC m=+1469.405042175" observedRunningTime="2026-03-20 13:47:35.441039301 +0000 UTC m=+1470.322065441" watchObservedRunningTime="2026-03-20 13:47:35.446058433 +0000 UTC m=+1470.327084563" Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.469524 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.929245644 podStartE2EDuration="6.469500317s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="2026-03-20 13:47:30.982969779 +0000 UTC 
m=+1465.863995909" lastFinishedPulling="2026-03-20 13:47:34.523224452 +0000 UTC m=+1469.404250582" observedRunningTime="2026-03-20 13:47:35.463813606 +0000 UTC m=+1470.344839746" watchObservedRunningTime="2026-03-20 13:47:35.469500317 +0000 UTC m=+1470.350526447" Mar 20 13:47:35 crc kubenswrapper[4856]: I0320 13:47:35.485039 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.329182865 podStartE2EDuration="6.485019786s" podCreationTimestamp="2026-03-20 13:47:29 +0000 UTC" firstStartedPulling="2026-03-20 13:47:30.369578913 +0000 UTC m=+1465.250605043" lastFinishedPulling="2026-03-20 13:47:34.525415824 +0000 UTC m=+1469.406441964" observedRunningTime="2026-03-20 13:47:35.478335907 +0000 UTC m=+1470.359362047" watchObservedRunningTime="2026-03-20 13:47:35.485019786 +0000 UTC m=+1470.366045926" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.402841 4856 generic.go:334] "Generic (PLEG): container finished" podID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerID="dd616a65806e32ecf6ca977b0fd2d7643d582d6c8e1e034aa71600f4b05c27ff" exitCode=0 Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.403148 4856 generic.go:334] "Generic (PLEG): container finished" podID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerID="b8a533331e86e9b40051e2ce798be835d7717ec2238a43974bf44c29d23448b3" exitCode=143 Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.403080 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerDied","Data":"dd616a65806e32ecf6ca977b0fd2d7643d582d6c8e1e034aa71600f4b05c27ff"} Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.404229 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerDied","Data":"b8a533331e86e9b40051e2ce798be835d7717ec2238a43974bf44c29d23448b3"} 
Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.551209 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.617082 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs\") pod \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.617501 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs" (OuterVolumeSpecName: "logs") pod "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" (UID: "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.617771 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t4xt\" (UniqueName: \"kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt\") pod \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.618605 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data\") pod \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.618841 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle\") pod \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\" (UID: \"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb\") " Mar 
20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.619506 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.624444 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt" (OuterVolumeSpecName: "kube-api-access-8t4xt") pod "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" (UID: "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb"). InnerVolumeSpecName "kube-api-access-8t4xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.645489 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data" (OuterVolumeSpecName: "config-data") pod "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" (UID: "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.646678 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" (UID: "4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.721425 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.721468 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:36 crc kubenswrapper[4856]: I0320 13:47:36.721484 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8t4xt\" (UniqueName: \"kubernetes.io/projected/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb-kube-api-access-8t4xt\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.415152 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.415266 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb","Type":"ContainerDied","Data":"a48744cf62921358f594eb47bf333396fbc94a3764057d22ebc65c23c15de337"} Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.416539 4856 scope.go:117] "RemoveContainer" containerID="dd616a65806e32ecf6ca977b0fd2d7643d582d6c8e1e034aa71600f4b05c27ff" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.460547 4856 scope.go:117] "RemoveContainer" containerID="b8a533331e86e9b40051e2ce798be835d7717ec2238a43974bf44c29d23448b3" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.563569 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.637772 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.654716 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:37 crc kubenswrapper[4856]: E0320 13:47:37.655759 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-log" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.655779 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-log" Mar 20 13:47:37 crc kubenswrapper[4856]: E0320 13:47:37.655821 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-metadata" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.655854 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-metadata" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.656303 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-log" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.656367 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" containerName="nova-metadata-metadata" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.658151 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.685521 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.686192 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.688372 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.746398 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.746544 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.746603 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgprv\" (UniqueName: \"kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.746674 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.746703 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: E0320 13:47:37.826541 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c2bb138_cdb6_4faa_abf4_3b3a9db0cecb.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.829378 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb" path="/var/lib/kubelet/pods/4c2bb138-cdb6-4faa-abf4-3b3a9db0cecb/volumes" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.848666 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.848753 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.848813 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.848926 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.848987 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgprv\" (UniqueName: \"kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.852273 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.855535 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.856165 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " 
pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.856262 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:37 crc kubenswrapper[4856]: I0320 13:47:37.867395 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgprv\" (UniqueName: \"kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv\") pod \"nova-metadata-0\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " pod="openstack/nova-metadata-0" Mar 20 13:47:38 crc kubenswrapper[4856]: I0320 13:47:38.013986 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:38 crc kubenswrapper[4856]: I0320 13:47:38.507523 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.476807 4856 generic.go:334] "Generic (PLEG): container finished" podID="f5f44803-700d-498a-819c-881f5959b477" containerID="77e3580170a69870b08acfefe46b48cd21070ff07cffab50cec1275d006beea4" exitCode=0 Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.476896 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qm47c" event={"ID":"f5f44803-700d-498a-819c-881f5959b477","Type":"ContainerDied","Data":"77e3580170a69870b08acfefe46b48cd21070ff07cffab50cec1275d006beea4"} Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.479576 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerStarted","Data":"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d"} Mar 20 13:47:39 crc kubenswrapper[4856]: 
I0320 13:47:39.479615 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerStarted","Data":"2124ac7d1e92c360dc72077356f4ef8606f5edd062efb1381f78aac0a629272d"} Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.635813 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.998932 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:39 crc kubenswrapper[4856]: I0320 13:47:39.999258 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.028530 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.058790 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.059908 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.094365 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"] Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.094618 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="dnsmasq-dns" containerID="cri-o://0b2aff424ce009b50e72675ea6c6d94b540a7fe671e015a60a1c1a25e9891713" gracePeriod=10 Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.133202 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:47:40 crc 
kubenswrapper[4856]: I0320 13:47:40.497483 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerStarted","Data":"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0"} Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.522964 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.522944614 podStartE2EDuration="3.522944614s" podCreationTimestamp="2026-03-20 13:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:40.517811338 +0000 UTC m=+1475.398837478" watchObservedRunningTime="2026-03-20 13:47:40.522944614 +0000 UTC m=+1475.403970744" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.533839 4856 generic.go:334] "Generic (PLEG): container finished" podID="5e146b98-057f-467e-994a-1fabff7911bd" containerID="0b2aff424ce009b50e72675ea6c6d94b540a7fe671e015a60a1c1a25e9891713" exitCode=0 Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.534037 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" event={"ID":"5e146b98-057f-467e-994a-1fabff7911bd","Type":"ContainerDied","Data":"0b2aff424ce009b50e72675ea6c6d94b540a7fe671e015a60a1c1a25e9891713"} Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.605338 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.674977 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730153 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw5f2\" (UniqueName: \"kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730207 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730251 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730452 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730504 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.730523 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb\") pod \"5e146b98-057f-467e-994a-1fabff7911bd\" (UID: \"5e146b98-057f-467e-994a-1fabff7911bd\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.763052 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2" (OuterVolumeSpecName: "kube-api-access-kw5f2") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "kube-api-access-kw5f2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.802537 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.815931 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.822923 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.833323 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.833372 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.833385 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw5f2\" (UniqueName: \"kubernetes.io/projected/5e146b98-057f-467e-994a-1fabff7911bd-kube-api-access-kw5f2\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.833398 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.844483 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.856560 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config" (OuterVolumeSpecName: "config") pod "5e146b98-057f-467e-994a-1fabff7911bd" (UID: "5e146b98-057f-467e-994a-1fabff7911bd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.876461 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qm47c" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.935143 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts\") pod \"f5f44803-700d-498a-819c-881f5959b477\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.935203 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data\") pod \"f5f44803-700d-498a-819c-881f5959b477\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.935375 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle\") pod \"f5f44803-700d-498a-819c-881f5959b477\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.935530 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkwg8\" (UniqueName: \"kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8\") pod \"f5f44803-700d-498a-819c-881f5959b477\" (UID: \"f5f44803-700d-498a-819c-881f5959b477\") " Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.936163 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.936188 4856 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5e146b98-057f-467e-994a-1fabff7911bd-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.938541 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts" (OuterVolumeSpecName: "scripts") pod "f5f44803-700d-498a-819c-881f5959b477" (UID: "f5f44803-700d-498a-819c-881f5959b477"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.943768 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8" (OuterVolumeSpecName: "kube-api-access-lkwg8") pod "f5f44803-700d-498a-819c-881f5959b477" (UID: "f5f44803-700d-498a-819c-881f5959b477"). InnerVolumeSpecName "kube-api-access-lkwg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.962035 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5f44803-700d-498a-819c-881f5959b477" (UID: "f5f44803-700d-498a-819c-881f5959b477"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:40 crc kubenswrapper[4856]: I0320 13:47:40.979021 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data" (OuterVolumeSpecName: "config-data") pod "f5f44803-700d-498a-819c-881f5959b477" (UID: "f5f44803-700d-498a-819c-881f5959b477"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.037940 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.037976 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.037993 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5f44803-700d-498a-819c-881f5959b477-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.038006 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkwg8\" (UniqueName: \"kubernetes.io/projected/f5f44803-700d-498a-819c-881f5959b477-kube-api-access-lkwg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.084747 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.084610 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.552964 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-qm47c" event={"ID":"f5f44803-700d-498a-819c-881f5959b477","Type":"ContainerDied","Data":"34743ab6cb2152fe89644cb800763f0838b36549a158ebf04f53460c1b9f6a0a"} Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.553522 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34743ab6cb2152fe89644cb800763f0838b36549a158ebf04f53460c1b9f6a0a" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.553025 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qm47c" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.559208 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" event={"ID":"5e146b98-057f-467e-994a-1fabff7911bd","Type":"ContainerDied","Data":"da87cd94f4587634846a8c3ab62eba016273b3935570b5e97d88e8cf2b3d85fa"} Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.559244 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-ztf4n" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.559319 4856 scope.go:117] "RemoveContainer" containerID="0b2aff424ce009b50e72675ea6c6d94b540a7fe671e015a60a1c1a25e9891713" Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.598008 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"] Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.607982 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-ztf4n"] Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.691357 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.702816 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.703471 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-api" containerID="cri-o://c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57" gracePeriod=30 Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.703130 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-log" containerID="cri-o://d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244" gracePeriod=30 Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.717748 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:41 crc kubenswrapper[4856]: I0320 13:47:41.831668 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e146b98-057f-467e-994a-1fabff7911bd" path="/var/lib/kubelet/pods/5e146b98-057f-467e-994a-1fabff7911bd/volumes" Mar 20 13:47:41 crc 
kubenswrapper[4856]: I0320 13:47:41.908716 4856 scope.go:117] "RemoveContainer" containerID="bd7558b32a18f228234bf348b4f419616de27672d023997320562223c2465641" Mar 20 13:47:42 crc kubenswrapper[4856]: I0320 13:47:42.574955 4856 generic.go:334] "Generic (PLEG): container finished" podID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerID="d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244" exitCode=143 Mar 20 13:47:42 crc kubenswrapper[4856]: I0320 13:47:42.575421 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerDied","Data":"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244"} Mar 20 13:47:42 crc kubenswrapper[4856]: I0320 13:47:42.588070 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-log" containerID="cri-o://3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" gracePeriod=30 Mar 20 13:47:42 crc kubenswrapper[4856]: I0320 13:47:42.588397 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-metadata" containerID="cri-o://cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" gracePeriod=30 Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.573417 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602243 4856 generic.go:334] "Generic (PLEG): container finished" podID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerID="cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" exitCode=0 Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602311 4856 generic.go:334] "Generic (PLEG): container finished" podID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerID="3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" exitCode=143 Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602561 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerName="nova-scheduler-scheduler" containerID="cri-o://9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" gracePeriod=30 Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602882 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerDied","Data":"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0"} Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602931 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerDied","Data":"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d"} Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602944 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11185406-d5a7-4c9a-a85a-f944c255c20c","Type":"ContainerDied","Data":"2124ac7d1e92c360dc72077356f4ef8606f5edd062efb1381f78aac0a629272d"} Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602963 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.602979 4856 scope.go:117] "RemoveContainer" containerID="cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.633813 4856 scope.go:117] "RemoveContainer" containerID="3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.657488 4856 scope.go:117] "RemoveContainer" containerID="cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.658125 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0\": container with ID starting with cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0 not found: ID does not exist" containerID="cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.658178 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0"} err="failed to get container status \"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0\": rpc error: code = NotFound desc = could not find container \"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0\": container with ID starting with cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0 not found: ID does not exist" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.658204 4856 scope.go:117] "RemoveContainer" containerID="3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.658547 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d\": container with ID starting with 3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d not found: ID does not exist" containerID="3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.658586 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d"} err="failed to get container status \"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d\": rpc error: code = NotFound desc = could not find container \"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d\": container with ID starting with 3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d not found: ID does not exist" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.658605 4856 scope.go:117] "RemoveContainer" containerID="cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.658993 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0"} err="failed to get container status \"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0\": rpc error: code = NotFound desc = could not find container \"cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0\": container with ID starting with cc43db15d7f2af9019ddbd8ad76ee76f633a7a2685db21a5fd0bb05fc7af86d0 not found: ID does not exist" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.659020 4856 scope.go:117] "RemoveContainer" containerID="3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.659230 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d"} err="failed to get container status \"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d\": rpc error: code = NotFound desc = could not find container \"3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d\": container with ID starting with 3bee73e0b50d043dbae43d3ba2f8643a9148d817f3e04c5310a0df185d41619d not found: ID does not exist" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751135 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs\") pod \"11185406-d5a7-4c9a-a85a-f944c255c20c\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751197 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle\") pod \"11185406-d5a7-4c9a-a85a-f944c255c20c\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751230 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgprv\" (UniqueName: \"kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv\") pod \"11185406-d5a7-4c9a-a85a-f944c255c20c\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751373 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs\") pod \"11185406-d5a7-4c9a-a85a-f944c255c20c\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751431 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data\") pod \"11185406-d5a7-4c9a-a85a-f944c255c20c\" (UID: \"11185406-d5a7-4c9a-a85a-f944c255c20c\") " Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.751965 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs" (OuterVolumeSpecName: "logs") pod "11185406-d5a7-4c9a-a85a-f944c255c20c" (UID: "11185406-d5a7-4c9a-a85a-f944c255c20c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.757586 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv" (OuterVolumeSpecName: "kube-api-access-lgprv") pod "11185406-d5a7-4c9a-a85a-f944c255c20c" (UID: "11185406-d5a7-4c9a-a85a-f944c255c20c"). InnerVolumeSpecName "kube-api-access-lgprv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.783670 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data" (OuterVolumeSpecName: "config-data") pod "11185406-d5a7-4c9a-a85a-f944c255c20c" (UID: "11185406-d5a7-4c9a-a85a-f944c255c20c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.786254 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11185406-d5a7-4c9a-a85a-f944c255c20c" (UID: "11185406-d5a7-4c9a-a85a-f944c255c20c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.813834 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "11185406-d5a7-4c9a-a85a-f944c255c20c" (UID: "11185406-d5a7-4c9a-a85a-f944c255c20c"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.853419 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11185406-d5a7-4c9a-a85a-f944c255c20c-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.853460 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.853470 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgprv\" (UniqueName: \"kubernetes.io/projected/11185406-d5a7-4c9a-a85a-f944c255c20c-kube-api-access-lgprv\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.853479 4856 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.853493 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11185406-d5a7-4c9a-a85a-f944c255c20c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.931903 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.947166 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.955427 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.955871 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5f44803-700d-498a-819c-881f5959b477" containerName="nova-manage" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.955897 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5f44803-700d-498a-819c-881f5959b477" containerName="nova-manage" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.955915 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-log" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.955926 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-log" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.955955 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-metadata" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.955963 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-metadata" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.955977 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="init" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.955986 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="init" Mar 20 13:47:43 crc kubenswrapper[4856]: E0320 13:47:43.955999 4856 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="dnsmasq-dns" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.956007 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="dnsmasq-dns" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.956219 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5f44803-700d-498a-819c-881f5959b477" containerName="nova-manage" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.956237 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-metadata" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.956258 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" containerName="nova-metadata-log" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.956295 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e146b98-057f-467e-994a-1fabff7911bd" containerName="dnsmasq-dns" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.957602 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.961709 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.961795 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:47:43 crc kubenswrapper[4856]: I0320 13:47:43.968529 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.060385 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.060449 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.060594 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66vpz\" (UniqueName: \"kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.060710 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data\") pod \"nova-metadata-0\" 
(UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.060728 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.162947 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.163029 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.163085 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66vpz\" (UniqueName: \"kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.163155 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.163199 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.163465 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.167715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.168468 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.170366 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.181689 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66vpz\" (UniqueName: \"kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz\") pod \"nova-metadata-0\" 
(UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") " pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.286117 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:47:44 crc kubenswrapper[4856]: I0320 13:47:44.752755 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:47:44 crc kubenswrapper[4856]: W0320 13:47:44.758796 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31e7f358_22b0_4bdb_a685_f0009192fd33.slice/crio-a703602f20433b902155ebbe0a63e5ed4c1cdeb4813859b3e478e297731be702 WatchSource:0}: Error finding container a703602f20433b902155ebbe0a63e5ed4c1cdeb4813859b3e478e297731be702: Status 404 returned error can't find the container with id a703602f20433b902155ebbe0a63e5ed4c1cdeb4813859b3e478e297731be702 Mar 20 13:47:45 crc kubenswrapper[4856]: E0320 13:47:45.060034 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:47:45 crc kubenswrapper[4856]: E0320 13:47:45.062331 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:47:45 crc kubenswrapper[4856]: E0320 13:47:45.063542 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:47:45 crc kubenswrapper[4856]: E0320 13:47:45.063578 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerName="nova-scheduler-scheduler" Mar 20 13:47:45 crc kubenswrapper[4856]: I0320 13:47:45.624072 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerStarted","Data":"60a3d87fd01033f0a3d0e16a5be5e59a0968d15014710c0aa23ea179037d5fcc"} Mar 20 13:47:45 crc kubenswrapper[4856]: I0320 13:47:45.624475 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerStarted","Data":"3c96b0cee2fbdfe4534153f5a985af176373e829a592d6a22caefdab2de4ceef"} Mar 20 13:47:45 crc kubenswrapper[4856]: I0320 13:47:45.624490 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerStarted","Data":"a703602f20433b902155ebbe0a63e5ed4c1cdeb4813859b3e478e297731be702"} Mar 20 13:47:45 crc kubenswrapper[4856]: I0320 13:47:45.654785 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6547603090000003 podStartE2EDuration="2.654760309s" podCreationTimestamp="2026-03-20 13:47:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:45.640423723 +0000 UTC m=+1480.521449873" watchObservedRunningTime="2026-03-20 13:47:45.654760309 +0000 UTC m=+1480.535786479" Mar 20 13:47:45 
crc kubenswrapper[4856]: I0320 13:47:45.838815 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11185406-d5a7-4c9a-a85a-f944c255c20c" path="/var/lib/kubelet/pods/11185406-d5a7-4c9a-a85a-f944c255c20c/volumes" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.309334 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.404998 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle\") pod \"3d3e255b-faee-4024-96dd-6e10ef862fb6\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.405214 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2pll\" (UniqueName: \"kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll\") pod \"3d3e255b-faee-4024-96dd-6e10ef862fb6\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.405515 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data\") pod \"3d3e255b-faee-4024-96dd-6e10ef862fb6\" (UID: \"3d3e255b-faee-4024-96dd-6e10ef862fb6\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.413843 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll" (OuterVolumeSpecName: "kube-api-access-p2pll") pod "3d3e255b-faee-4024-96dd-6e10ef862fb6" (UID: "3d3e255b-faee-4024-96dd-6e10ef862fb6"). InnerVolumeSpecName "kube-api-access-p2pll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.431868 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d3e255b-faee-4024-96dd-6e10ef862fb6" (UID: "3d3e255b-faee-4024-96dd-6e10ef862fb6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.464681 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data" (OuterVolumeSpecName: "config-data") pod "3d3e255b-faee-4024-96dd-6e10ef862fb6" (UID: "3d3e255b-faee-4024-96dd-6e10ef862fb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.508087 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.508130 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d3e255b-faee-4024-96dd-6e10ef862fb6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.508145 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2pll\" (UniqueName: \"kubernetes.io/projected/3d3e255b-faee-4024-96dd-6e10ef862fb6-kube-api-access-p2pll\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.526251 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.609047 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rncs\" (UniqueName: \"kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs\") pod \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.609303 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs\") pod \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.609337 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data\") pod \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.609422 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle\") pod \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\" (UID: \"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e\") " Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.610522 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs" (OuterVolumeSpecName: "logs") pod "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" (UID: "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.613778 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs" (OuterVolumeSpecName: "kube-api-access-9rncs") pod "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" (UID: "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e"). InnerVolumeSpecName "kube-api-access-9rncs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.633071 4856 generic.go:334] "Generic (PLEG): container finished" podID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" exitCode=0 Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.633132 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d3e255b-faee-4024-96dd-6e10ef862fb6","Type":"ContainerDied","Data":"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a"} Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.633158 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3d3e255b-faee-4024-96dd-6e10ef862fb6","Type":"ContainerDied","Data":"04761d07656a616fa7d0a9e1a46514d2d6df1de62a9b553703a38bf81c19988a"} Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.633175 4856 scope.go:117] "RemoveContainer" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.633312 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.637292 4856 generic.go:334] "Generic (PLEG): container finished" podID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerID="c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57" exitCode=0 Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.637626 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerDied","Data":"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57"} Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.637800 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e","Type":"ContainerDied","Data":"2f8e3e143c8496da6ccb2163dce114265bd091e65ac679ba41d513d9fed745ae"} Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.639342 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data" (OuterVolumeSpecName: "config-data") pod "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" (UID: "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.639819 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.646390 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" (UID: "bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.670193 4856 scope.go:117] "RemoveContainer" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.671154 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a\": container with ID starting with 9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a not found: ID does not exist" containerID="9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.671198 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a"} err="failed to get container status \"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a\": rpc error: code = NotFound desc = could not find container \"9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a\": container with ID starting with 9d72d57b5717e945f231ae2ccf79de2711954e69d3569b3cab05df08d953027a not found: ID does not exist" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.671224 4856 scope.go:117] "RemoveContainer" containerID="c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.673761 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.687436 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.696873 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.697389 4856 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerName="nova-scheduler-scheduler" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697407 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerName="nova-scheduler-scheduler" Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.697438 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-api" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697447 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-api" Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.697467 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-log" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697474 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-log" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697716 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" containerName="nova-scheduler-scheduler" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697733 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-api" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.697753 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" containerName="nova-api-log" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.698478 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.701086 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.704746 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.726581 4856 scope.go:117] "RemoveContainer" containerID="d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.728185 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.728552 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rncs\" (UniqueName: \"kubernetes.io/projected/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-kube-api-access-9rncs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.728565 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.728573 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.745559 4856 scope.go:117] "RemoveContainer" containerID="c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57" Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.745973 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57\": container with ID starting with c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57 not found: ID does not exist" containerID="c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.746003 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57"} err="failed to get container status \"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57\": rpc error: code = NotFound desc = could not find container \"c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57\": container with ID starting with c5a6327a42dcdc963d46944b8184c5a784ad31c254a10e602d3e0ec725419c57 not found: ID does not exist" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.746024 4856 scope.go:117] "RemoveContainer" containerID="d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244" Mar 20 13:47:46 crc kubenswrapper[4856]: E0320 13:47:46.746312 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244\": container with ID starting with d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244 not found: ID does not exist" containerID="d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.746333 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244"} err="failed to get container status \"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244\": rpc error: code = NotFound desc = could not find container \"d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244\": container with ID 
starting with d8caac1fcb76493654d2deeb75b92f3390597406874bd8ab8589e7bcdb3ac244 not found: ID does not exist" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.830586 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.830652 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.830896 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxvv\" (UniqueName: \"kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.932665 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.932813 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxvv\" (UniqueName: \"kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " 
pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.932919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.936675 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.939832 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:46 crc kubenswrapper[4856]: I0320 13:47:46.952176 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxvv\" (UniqueName: \"kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv\") pod \"nova-scheduler-0\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") " pod="openstack/nova-scheduler-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.035570 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.044859 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.067721 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.075382 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.076900 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.080984 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.127789 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.242053 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.242297 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgtjx\" (UniqueName: \"kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.242331 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs\") pod \"nova-api-0\" (UID: 
\"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.242366 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.343812 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgtjx\" (UniqueName: \"kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.344147 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.344190 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.344296 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.344696 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.352034 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.352332 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.361720 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgtjx\" (UniqueName: \"kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx\") pod \"nova-api-0\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.482539 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.565815 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:47:47 crc kubenswrapper[4856]: W0320 13:47:47.569520 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40599482_130e_4649_a75a_9e7cc3891543.slice/crio-825a11dadd9eb0bd5de6ebc2be9a3377f97a5f723d930b46362ccc16ff68732d WatchSource:0}: Error finding container 825a11dadd9eb0bd5de6ebc2be9a3377f97a5f723d930b46362ccc16ff68732d: Status 404 returned error can't find the container with id 825a11dadd9eb0bd5de6ebc2be9a3377f97a5f723d930b46362ccc16ff68732d Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.663880 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40599482-130e-4649-a75a-9e7cc3891543","Type":"ContainerStarted","Data":"825a11dadd9eb0bd5de6ebc2be9a3377f97a5f723d930b46362ccc16ff68732d"} Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.838586 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d3e255b-faee-4024-96dd-6e10ef862fb6" path="/var/lib/kubelet/pods/3d3e255b-faee-4024-96dd-6e10ef862fb6/volumes" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.839231 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e" path="/var/lib/kubelet/pods/bfbc6ea7-9a2c-4e19-ba77-9e0c62b8b17e/volumes" Mar 20 13:47:47 crc kubenswrapper[4856]: I0320 13:47:47.862653 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.685340 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerStarted","Data":"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520"} Mar 20 
13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.685754 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerStarted","Data":"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467"} Mar 20 13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.685777 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerStarted","Data":"e0eb9ca00b7533a07d8c0e2174b927652f07927203cb6d0a42c55b3ec7d50bbc"} Mar 20 13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.690041 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40599482-130e-4649-a75a-9e7cc3891543","Type":"ContainerStarted","Data":"1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5"} Mar 20 13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.704459 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7044406159999999 podStartE2EDuration="1.704440616s" podCreationTimestamp="2026-03-20 13:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:48.704191219 +0000 UTC m=+1483.585217359" watchObservedRunningTime="2026-03-20 13:47:48.704440616 +0000 UTC m=+1483.585466746" Mar 20 13:47:48 crc kubenswrapper[4856]: I0320 13:47:48.725991 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.725968776 podStartE2EDuration="2.725968776s" podCreationTimestamp="2026-03-20 13:47:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:48.724545986 +0000 UTC m=+1483.605572116" watchObservedRunningTime="2026-03-20 13:47:48.725968776 +0000 UTC 
m=+1483.606994906" Mar 20 13:47:49 crc kubenswrapper[4856]: I0320 13:47:49.703207 4856 generic.go:334] "Generic (PLEG): container finished" podID="6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" containerID="7c030a52fa49c1ad27d857f75abf652057f78f1ef66883ba32b85866d7b6baf7" exitCode=0 Mar 20 13:47:49 crc kubenswrapper[4856]: I0320 13:47:49.703604 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2ngth" event={"ID":"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0","Type":"ContainerDied","Data":"7c030a52fa49c1ad27d857f75abf652057f78f1ef66883ba32b85866d7b6baf7"} Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.100765 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2ngth" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.259772 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts\") pod \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.259872 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data\") pod \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.259965 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkbks\" (UniqueName: \"kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks\") pod \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.259999 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle\") pod \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\" (UID: \"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0\") " Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.265588 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks" (OuterVolumeSpecName: "kube-api-access-jkbks") pod "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" (UID: "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0"). InnerVolumeSpecName "kube-api-access-jkbks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.266221 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts" (OuterVolumeSpecName: "scripts") pod "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" (UID: "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.283928 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data" (OuterVolumeSpecName: "config-data") pod "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" (UID: "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.300838 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" (UID: "6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.361912 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.361943 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.361952 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkbks\" (UniqueName: \"kubernetes.io/projected/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-kube-api-access-jkbks\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.361964 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.726792 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2ngth" event={"ID":"6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0","Type":"ContainerDied","Data":"0ffc7a1d0d72910ed0d475ce38a19892a17d5c9cf6ad09cdcebdcb5ba828fd85"} Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.726828 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ffc7a1d0d72910ed0d475ce38a19892a17d5c9cf6ad09cdcebdcb5ba828fd85" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.727391 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2ngth" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.834447 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:47:51 crc kubenswrapper[4856]: E0320 13:47:51.834820 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" containerName="nova-cell1-conductor-db-sync" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.834845 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" containerName="nova-cell1-conductor-db-sync" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.835061 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" containerName="nova-cell1-conductor-db-sync" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.835926 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.838802 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.854678 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.874468 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.874559 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.874851 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdtsr\" (UniqueName: \"kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.976066 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdtsr\" (UniqueName: \"kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.976423 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.976721 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.981694 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle\") 
pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.983325 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:51 crc kubenswrapper[4856]: I0320 13:47:51.999018 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdtsr\" (UniqueName: \"kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr\") pod \"nova-cell1-conductor-0\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4856]: I0320 13:47:52.036819 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:47:52 crc kubenswrapper[4856]: I0320 13:47:52.155250 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4856]: I0320 13:47:52.698577 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:47:52 crc kubenswrapper[4856]: I0320 13:47:52.739797 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7657bdc0-a1e5-4421-aceb-8cd410fc0226","Type":"ContainerStarted","Data":"49d0def3316bc781556684d21a232d2b5373089f3eec9874cb227de47773b82b"} Mar 20 13:47:53 crc kubenswrapper[4856]: I0320 13:47:53.752954 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7657bdc0-a1e5-4421-aceb-8cd410fc0226","Type":"ContainerStarted","Data":"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd"} Mar 20 13:47:53 crc kubenswrapper[4856]: I0320 13:47:53.753481 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:53 crc kubenswrapper[4856]: I0320 13:47:53.778927 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.778908259 podStartE2EDuration="2.778908259s" podCreationTimestamp="2026-03-20 13:47:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:53.773634989 +0000 UTC m=+1488.654661119" watchObservedRunningTime="2026-03-20 13:47:53.778908259 +0000 UTC m=+1488.659934399" Mar 20 13:47:54 crc kubenswrapper[4856]: I0320 13:47:54.286943 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:47:54 crc kubenswrapper[4856]: I0320 13:47:54.287001 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:47:55 crc kubenswrapper[4856]: I0320 13:47:55.303533 4856 prober.go:107] 
"Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:55 crc kubenswrapper[4856]: I0320 13:47:55.303545 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.035894 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.079639 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.189468 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.483427 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.483717 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:47:57 crc kubenswrapper[4856]: I0320 13:47:57.816525 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:47:58 crc kubenswrapper[4856]: I0320 13:47:58.565540 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:58 crc kubenswrapper[4856]: I0320 13:47:58.565564 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.147447 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dfsv2"] Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.149259 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dfsv2" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.151406 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.151618 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.151858 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.167840 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dfsv2"] Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.266874 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw96m\" (UniqueName: \"kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m\") pod \"auto-csr-approver-29566908-dfsv2\" (UID: \"facc5117-d669-458c-b9fc-33d0e67c4610\") " pod="openshift-infra/auto-csr-approver-29566908-dfsv2" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.371176 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw96m\" (UniqueName: \"kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m\") pod \"auto-csr-approver-29566908-dfsv2\" (UID: \"facc5117-d669-458c-b9fc-33d0e67c4610\") " pod="openshift-infra/auto-csr-approver-29566908-dfsv2" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.409216 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw96m\" (UniqueName: \"kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m\") pod \"auto-csr-approver-29566908-dfsv2\" (UID: \"facc5117-d669-458c-b9fc-33d0e67c4610\") " pod="openshift-infra/auto-csr-approver-29566908-dfsv2" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.472369 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dfsv2" Mar 20 13:48:00 crc kubenswrapper[4856]: I0320 13:48:00.903995 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dfsv2"] Mar 20 13:48:01 crc kubenswrapper[4856]: I0320 13:48:01.833179 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dfsv2" event={"ID":"facc5117-d669-458c-b9fc-33d0e67c4610","Type":"ContainerStarted","Data":"84de21f030163cc08ce6bf8b7d667c60cd620dff723481d8dc7904e59dbe74a3"} Mar 20 13:48:02 crc kubenswrapper[4856]: I0320 13:48:02.286260 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:48:02 crc kubenswrapper[4856]: I0320 13:48:02.286706 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:48:04 crc kubenswrapper[4856]: I0320 13:48:04.291938 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:48:04 crc kubenswrapper[4856]: I0320 
13:48:04.295179 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:48:04 crc kubenswrapper[4856]: I0320 13:48:04.296675 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:48:04 crc kubenswrapper[4856]: I0320 13:48:04.868175 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.483419 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.484492 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.765624 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.798617 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data\") pod \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.798677 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx5dl\" (UniqueName: \"kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl\") pod \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\" (UID: \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.798864 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle\") pod \"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\" (UID: 
\"cb9ec39c-3fd5-4477-9868-44e5424f9bb3\") " Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.806187 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl" (OuterVolumeSpecName: "kube-api-access-dx5dl") pod "cb9ec39c-3fd5-4477-9868-44e5424f9bb3" (UID: "cb9ec39c-3fd5-4477-9868-44e5424f9bb3"). InnerVolumeSpecName "kube-api-access-dx5dl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.833550 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data" (OuterVolumeSpecName: "config-data") pod "cb9ec39c-3fd5-4477-9868-44e5424f9bb3" (UID: "cb9ec39c-3fd5-4477-9868-44e5424f9bb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.846335 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb9ec39c-3fd5-4477-9868-44e5424f9bb3" (UID: "cb9ec39c-3fd5-4477-9868-44e5424f9bb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.869806 4856 generic.go:334] "Generic (PLEG): container finished" podID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" containerID="b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169" exitCode=137 Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.870663 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.872522 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb9ec39c-3fd5-4477-9868-44e5424f9bb3","Type":"ContainerDied","Data":"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"}
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.872568 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cb9ec39c-3fd5-4477-9868-44e5424f9bb3","Type":"ContainerDied","Data":"c500d3df8081746eaa449528a230fd690e9fb4c5c929194cc7f481f7c8f43e57"}
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.872592 4856 scope.go:117] "RemoveContainer" containerID="b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.901182 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.901207 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx5dl\" (UniqueName: \"kubernetes.io/projected/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-kube-api-access-dx5dl\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.901216 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb9ec39c-3fd5-4477-9868-44e5424f9bb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.925681 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.937654 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.952480 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:05 crc kubenswrapper[4856]: E0320 13:48:05.953009 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.953034 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.953261 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" containerName="nova-cell1-novncproxy-novncproxy"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.954113 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.956724 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.956849 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.957019 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.957409 4856 scope.go:117] "RemoveContainer" containerID="b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.960550 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:05 crc kubenswrapper[4856]: E0320 13:48:05.962826 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169\": container with ID starting with b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169 not found: ID does not exist" containerID="b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"
Mar 20 13:48:05 crc kubenswrapper[4856]: I0320 13:48:05.962869 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169"} err="failed to get container status \"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169\": rpc error: code = NotFound desc = could not find container \"b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169\": container with ID starting with b387f028380a3e42e16d4ad6c22aeb19da8d75884dcffe59984fb8aa42c93169 not found: ID does not exist"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.003171 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.003223 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.003356 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.003380 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.003443 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcgr\" (UniqueName: \"kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.105798 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.105833 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.105880 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcgr\" (UniqueName: \"kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.105954 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.105983 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.110183 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.110334 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.111769 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.113176 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.128753 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcgr\" (UniqueName: \"kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr\") pod \"nova-cell1-novncproxy-0\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.275486 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.760983 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:06 crc kubenswrapper[4856]: W0320 13:48:06.772171 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00a8314e_4faf_4926_82f2_35c25154a7b5.slice/crio-48bab3d6f0381fcb45d9eec3b168847a0ffbdc483bed2994febfd468a2615d0d WatchSource:0}: Error finding container 48bab3d6f0381fcb45d9eec3b168847a0ffbdc483bed2994febfd468a2615d0d: Status 404 returned error can't find the container with id 48bab3d6f0381fcb45d9eec3b168847a0ffbdc483bed2994febfd468a2615d0d
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.890917 4856 generic.go:334] "Generic (PLEG): container finished" podID="facc5117-d669-458c-b9fc-33d0e67c4610" containerID="8d752ab76f5a06d13b2f52ba3237d5019795b8648083187e81109c9309f5766b" exitCode=0
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.891077 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dfsv2" event={"ID":"facc5117-d669-458c-b9fc-33d0e67c4610","Type":"ContainerDied","Data":"8d752ab76f5a06d13b2f52ba3237d5019795b8648083187e81109c9309f5766b"}
Mar 20 13:48:06 crc kubenswrapper[4856]: I0320 13:48:06.894174 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a8314e-4faf-4926-82f2-35c25154a7b5","Type":"ContainerStarted","Data":"48bab3d6f0381fcb45d9eec3b168847a0ffbdc483bed2994febfd468a2615d0d"}
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.489142 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.489654 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.492566 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.498691 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.706959 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"]
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.711883 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.723227 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"]
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.755859 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.755925 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.755960 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.755997 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcfvx\" (UniqueName: \"kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.756077 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.756099 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.834878 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb9ec39c-3fd5-4477-9868-44e5424f9bb3" path="/var/lib/kubelet/pods/cb9ec39c-3fd5-4477-9868-44e5424f9bb3/volumes"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.857971 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858024 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858049 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858098 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcfvx\" (UniqueName: \"kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858176 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858193 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.858978 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.859892 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.859902 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.860456 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.860984 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.883144 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcfvx\" (UniqueName: \"kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx\") pod \"dnsmasq-dns-89c5cd4d5-4wp22\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:07 crc kubenswrapper[4856]: I0320 13:48:07.907865 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a8314e-4faf-4926-82f2-35c25154a7b5","Type":"ContainerStarted","Data":"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef"}
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.031830 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.417497 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dfsv2"
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.434688 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.434668196 podStartE2EDuration="3.434668196s" podCreationTimestamp="2026-03-20 13:48:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:07.979239096 +0000 UTC m=+1502.860265226" watchObservedRunningTime="2026-03-20 13:48:08.434668196 +0000 UTC m=+1503.315694326"
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.469764 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw96m\" (UniqueName: \"kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m\") pod \"facc5117-d669-458c-b9fc-33d0e67c4610\" (UID: \"facc5117-d669-458c-b9fc-33d0e67c4610\") "
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.474149 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m" (OuterVolumeSpecName: "kube-api-access-fw96m") pod "facc5117-d669-458c-b9fc-33d0e67c4610" (UID: "facc5117-d669-458c-b9fc-33d0e67c4610"). InnerVolumeSpecName "kube-api-access-fw96m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.570103 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"]
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.571904 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw96m\" (UniqueName: \"kubernetes.io/projected/facc5117-d669-458c-b9fc-33d0e67c4610-kube-api-access-fw96m\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.918938 4856 generic.go:334] "Generic (PLEG): container finished" podID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerID="6fb4d0d72db1e2865ac852c989e6934a1228985a148c4cfeae18d2d130395973" exitCode=0
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.919017 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" event={"ID":"636ad94a-b2ac-42c8-b83d-063d66cfeaf8","Type":"ContainerDied","Data":"6fb4d0d72db1e2865ac852c989e6934a1228985a148c4cfeae18d2d130395973"}
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.919389 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" event={"ID":"636ad94a-b2ac-42c8-b83d-063d66cfeaf8","Type":"ContainerStarted","Data":"14a75f74f0b2d3502e4b9ac703c52a1b8bafd4d32171f2d5f26b9fc3cd75bff1"}
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.921787 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dfsv2" event={"ID":"facc5117-d669-458c-b9fc-33d0e67c4610","Type":"ContainerDied","Data":"84de21f030163cc08ce6bf8b7d667c60cd620dff723481d8dc7904e59dbe74a3"}
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.921830 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84de21f030163cc08ce6bf8b7d667c60cd620dff723481d8dc7904e59dbe74a3"
Mar 20 13:48:08 crc kubenswrapper[4856]: I0320 13:48:08.921988 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dfsv2"
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.497197 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-8j2v4"]
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.515494 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-8j2v4"]
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.840821 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b72b040-1c32-472d-b5e1-8ee3a7ace646" path="/var/lib/kubelet/pods/4b72b040-1c32-472d-b5e1-8ee3a7ace646/volumes"
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.930582 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" event={"ID":"636ad94a-b2ac-42c8-b83d-063d66cfeaf8","Type":"ContainerStarted","Data":"de349a9fac18ad1354c9d674041dc5baa22d5b5162beba99ee8edb004fcba1ea"}
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.931820 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22"
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.954937 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" podStartSLOduration=2.95491762 podStartE2EDuration="2.95491762s" podCreationTimestamp="2026-03-20 13:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:09.949866446 +0000 UTC m=+1504.830892576" watchObservedRunningTime="2026-03-20 13:48:09.95491762 +0000 UTC m=+1504.835943750"
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.987604 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator:
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:48:09 crc kubenswrapper[4856]: I0320 13:48:09.987667 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.080297 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.080568 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-central-agent" containerID="cri-o://82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.080654 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="sg-core" containerID="cri-o://af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.080661 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="proxy-httpd" containerID="cri-o://62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.080682 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-notification-agent" containerID="cri-o://b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.598143 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.598670 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-log" containerID="cri-o://11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.598812 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-api" containerID="cri-o://e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520" gracePeriod=30
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.941051 4856 generic.go:334] "Generic (PLEG): container finished" podID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerID="11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467" exitCode=143
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.941125 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerDied","Data":"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467"}
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943707 4856 generic.go:334] "Generic (PLEG): container finished" podID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerID="62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758" exitCode=0
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943724 4856 generic.go:334] "Generic (PLEG): container finished" podID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerID="af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778" exitCode=2
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943734 4856 generic.go:334] "Generic (PLEG): container finished" podID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerID="82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de" exitCode=0
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943782 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerDied","Data":"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758"}
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943814 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerDied","Data":"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778"}
Mar 20 13:48:10 crc kubenswrapper[4856]: I0320 13:48:10.943840 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerDied","Data":"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de"}
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.276663 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.778112 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838668 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838711 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838836 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838863 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838942 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838969 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.838991 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x96zx\" (UniqueName: \"kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.839031 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts\") pod \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\" (UID: \"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03\") "
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.841696 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.841765 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.844810 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts" (OuterVolumeSpecName: "scripts") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.871475 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx" (OuterVolumeSpecName: "kube-api-access-x96zx") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "kube-api-access-x96zx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.871603 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.910485 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.928129 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941859 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941896 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941910 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941923 4856 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941937 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941948 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x96zx\" (UniqueName:
\"kubernetes.io/projected/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-kube-api-access-x96zx\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.941961 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.956753 4856 generic.go:334] "Generic (PLEG): container finished" podID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerID="b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e" exitCode=0 Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.956833 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.957006 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerDied","Data":"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e"} Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.957125 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c5ebb05-0f0f-46ba-b5a0-f225973b4e03","Type":"ContainerDied","Data":"62003e4a049c4400fda9b02f1372627e59526763c170c75405b9b0d26892927d"} Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.957198 4856 scope.go:117] "RemoveContainer" containerID="62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758" Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.973922 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data" (OuterVolumeSpecName: "config-data") pod "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" (UID: "6c5ebb05-0f0f-46ba-b5a0-f225973b4e03"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:11 crc kubenswrapper[4856]: I0320 13:48:11.984393 4856 scope.go:117] "RemoveContainer" containerID="af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.003655 4856 scope.go:117] "RemoveContainer" containerID="b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.019467 4856 scope.go:117] "RemoveContainer" containerID="82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.039697 4856 scope.go:117] "RemoveContainer" containerID="62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.040245 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758\": container with ID starting with 62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758 not found: ID does not exist" containerID="62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.040360 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758"} err="failed to get container status \"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758\": rpc error: code = NotFound desc = could not find container \"62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758\": container with ID starting with 62589ab8a988ab7f2cf5021f94b0d35567dced9f73ee45915551b4cdd886d758 not found: ID does not exist" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.040437 4856 scope.go:117] "RemoveContainer" 
containerID="af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.041314 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778\": container with ID starting with af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778 not found: ID does not exist" containerID="af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.041383 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778"} err="failed to get container status \"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778\": rpc error: code = NotFound desc = could not find container \"af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778\": container with ID starting with af2a60abce1689afab211756f5b7358ef9c159c3348cc5ef4479d791826db778 not found: ID does not exist" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.041410 4856 scope.go:117] "RemoveContainer" containerID="b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.041798 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e\": container with ID starting with b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e not found: ID does not exist" containerID="b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.041884 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e"} err="failed to get container status \"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e\": rpc error: code = NotFound desc = could not find container \"b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e\": container with ID starting with b9c739ab6d66ed0b3102f5f82f53607e85aae88b3cab3eea2f5903481ddf220e not found: ID does not exist" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.041909 4856 scope.go:117] "RemoveContainer" containerID="82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.042249 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de\": container with ID starting with 82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de not found: ID does not exist" containerID="82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.042308 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de"} err="failed to get container status \"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de\": rpc error: code = NotFound desc = could not find container \"82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de\": container with ID starting with 82a5766e920d5ca5c82f24409c23d78fb3f3fe7cb3d59fe1f83a0e0a1ab3c3de not found: ID does not exist" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.050306 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:12 crc kubenswrapper[4856]: 
I0320 13:48:12.293503 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.306147 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316293 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.316715 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-central-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316734 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-central-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.316750 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="proxy-httpd" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316756 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="proxy-httpd" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.316774 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-notification-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316780 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-notification-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.316799 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="sg-core" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316804 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" 
containerName="sg-core" Mar 20 13:48:12 crc kubenswrapper[4856]: E0320 13:48:12.316811 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="facc5117-d669-458c-b9fc-33d0e67c4610" containerName="oc" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316817 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="facc5117-d669-458c-b9fc-33d0e67c4610" containerName="oc" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316969 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-notification-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316986 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="sg-core" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.316994 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="facc5117-d669-458c-b9fc-33d0e67c4610" containerName="oc" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.317002 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="ceilometer-central-agent" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.317013 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" containerName="proxy-httpd" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.318619 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.322952 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.323356 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.324399 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.337982 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.456826 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.456922 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghfzk\" (UniqueName: \"kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.456961 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.457042 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.457236 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.457600 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.457751 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.457925 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.559629 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghfzk\" (UniqueName: \"kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk\") pod \"ceilometer-0\" 
(UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.560147 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.561198 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.561638 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.562007 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.562226 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.563188 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.565711 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.566799 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.567405 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.568494 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.568872 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 
13:48:12.569520 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.573568 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.576683 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.582081 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghfzk\" (UniqueName: \"kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk\") pod \"ceilometer-0\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " pod="openstack/ceilometer-0" Mar 20 13:48:12 crc kubenswrapper[4856]: I0320 13:48:12.639097 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:13 crc kubenswrapper[4856]: W0320 13:48:13.120797 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod050eefc7_c113_4198_b1ad_0645ad765a2a.slice/crio-0623e18d53c925ec76df1668374aef11ee1ac56a741346455b4cf199d417a6b9 WatchSource:0}: Error finding container 0623e18d53c925ec76df1668374aef11ee1ac56a741346455b4cf199d417a6b9: Status 404 returned error can't find the container with id 0623e18d53c925ec76df1668374aef11ee1ac56a741346455b4cf199d417a6b9 Mar 20 13:48:13 crc kubenswrapper[4856]: I0320 13:48:13.124619 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:13 crc kubenswrapper[4856]: I0320 13:48:13.125909 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:48:13 crc kubenswrapper[4856]: I0320 13:48:13.833748 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c5ebb05-0f0f-46ba-b5a0-f225973b4e03" path="/var/lib/kubelet/pods/6c5ebb05-0f0f-46ba-b5a0-f225973b4e03/volumes" Mar 20 13:48:13 crc kubenswrapper[4856]: I0320 13:48:13.995259 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerStarted","Data":"0623e18d53c925ec76df1668374aef11ee1ac56a741346455b4cf199d417a6b9"} Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.388688 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.509340 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle\") pod \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.509489 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data\") pod \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.509526 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs\") pod \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.509605 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgtjx\" (UniqueName: \"kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx\") pod \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\" (UID: \"743ddfe5-ed86-4559-b73b-e0b61d4412bd\") " Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.511140 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs" (OuterVolumeSpecName: "logs") pod "743ddfe5-ed86-4559-b73b-e0b61d4412bd" (UID: "743ddfe5-ed86-4559-b73b-e0b61d4412bd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.513759 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/743ddfe5-ed86-4559-b73b-e0b61d4412bd-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.523887 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx" (OuterVolumeSpecName: "kube-api-access-kgtjx") pod "743ddfe5-ed86-4559-b73b-e0b61d4412bd" (UID: "743ddfe5-ed86-4559-b73b-e0b61d4412bd"). InnerVolumeSpecName "kube-api-access-kgtjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.542966 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "743ddfe5-ed86-4559-b73b-e0b61d4412bd" (UID: "743ddfe5-ed86-4559-b73b-e0b61d4412bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.570341 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data" (OuterVolumeSpecName: "config-data") pod "743ddfe5-ed86-4559-b73b-e0b61d4412bd" (UID: "743ddfe5-ed86-4559-b73b-e0b61d4412bd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.616045 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.616091 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgtjx\" (UniqueName: \"kubernetes.io/projected/743ddfe5-ed86-4559-b73b-e0b61d4412bd-kube-api-access-kgtjx\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:14 crc kubenswrapper[4856]: I0320 13:48:14.616106 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/743ddfe5-ed86-4559-b73b-e0b61d4412bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.006111 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerStarted","Data":"1921c9edd328e3c015ebb3e3c66b2a013cbe2bfbc222ca92d31d2027cab3c79d"} Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.008543 4856 generic.go:334] "Generic (PLEG): container finished" podID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerID="e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520" exitCode=0 Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.008575 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerDied","Data":"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520"} Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.008597 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"743ddfe5-ed86-4559-b73b-e0b61d4412bd","Type":"ContainerDied","Data":"e0eb9ca00b7533a07d8c0e2174b927652f07927203cb6d0a42c55b3ec7d50bbc"} Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.008617 4856 scope.go:117] "RemoveContainer" containerID="e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.008747 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.049891 4856 scope.go:117] "RemoveContainer" containerID="11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.058674 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.072101 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.078134 4856 scope.go:117] "RemoveContainer" containerID="e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520" Mar 20 13:48:15 crc kubenswrapper[4856]: E0320 13:48:15.079389 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520\": container with ID starting with e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520 not found: ID does not exist" containerID="e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.079452 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520"} err="failed to get container status \"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520\": rpc error: code = NotFound desc = could not 
find container \"e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520\": container with ID starting with e99f29e53b270fae3656b3156c49cb9f5064d34643811e47ad69a28b37bc9520 not found: ID does not exist" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.079484 4856 scope.go:117] "RemoveContainer" containerID="11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467" Mar 20 13:48:15 crc kubenswrapper[4856]: E0320 13:48:15.082134 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467\": container with ID starting with 11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467 not found: ID does not exist" containerID="11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.082179 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467"} err="failed to get container status \"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467\": rpc error: code = NotFound desc = could not find container \"11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467\": container with ID starting with 11ca9df9143241fd2516eb89a3c6a6d7f2bf1ad7b4d5f33199752925b9ae1467 not found: ID does not exist" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.091931 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:15 crc kubenswrapper[4856]: E0320 13:48:15.092451 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-log" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.092493 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-log" Mar 20 13:48:15 
crc kubenswrapper[4856]: E0320 13:48:15.092539 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-api" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.092549 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-api" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.092769 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-log" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.092794 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" containerName="nova-api-api" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.093921 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.096838 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.097342 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.098527 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.123166 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226552 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226629 
4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226671 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226727 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226784 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.226847 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2vh8\" (UniqueName: \"kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329141 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329214 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329252 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329307 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329363 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.329406 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2vh8\" (UniqueName: \"kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 
13:48:15.329605 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.335308 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.336085 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.346087 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.346093 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.349542 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2vh8\" (UniqueName: \"kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8\") pod \"nova-api-0\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " 
pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.411796 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.848932 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="743ddfe5-ed86-4559-b73b-e0b61d4412bd" path="/var/lib/kubelet/pods/743ddfe5-ed86-4559-b73b-e0b61d4412bd/volumes" Mar 20 13:48:15 crc kubenswrapper[4856]: I0320 13:48:15.954436 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:15 crc kubenswrapper[4856]: W0320 13:48:15.968513 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c02093c_9927_4e80_9077_f6a0390c6721.slice/crio-4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e WatchSource:0}: Error finding container 4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e: Status 404 returned error can't find the container with id 4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e Mar 20 13:48:16 crc kubenswrapper[4856]: I0320 13:48:16.017116 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerStarted","Data":"4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e"} Mar 20 13:48:16 crc kubenswrapper[4856]: I0320 13:48:16.276471 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:16 crc kubenswrapper[4856]: I0320 13:48:16.300180 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.032354 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerStarted","Data":"8baa4310240d002ce370f1938841d57f264431b93149fa8dcd6abcd5b71b8287"} Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.032658 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerStarted","Data":"776beaea3c9377bde7e78faa04866103da0c3428a4e7fd231cab6e96b2a77971"} Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.037121 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerStarted","Data":"dfcea8e8e09281bcfc96738b570f941e3374939aa32a47165852731b6ed71dea"} Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.037155 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerStarted","Data":"d27f4b299032daa64a1c2fbb0a6497ce48fae2970eb08d16566671f428d9e3ea"} Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.053393 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.077117 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.077091825 podStartE2EDuration="2.077091825s" podCreationTimestamp="2026-03-20 13:48:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:17.064024175 +0000 UTC m=+1511.945050305" watchObservedRunningTime="2026-03-20 13:48:17.077091825 +0000 UTC m=+1511.958117955" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.216119 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-47jfh"] Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.217216 
4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.220980 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.221840 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.235649 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-47jfh"] Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.268237 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.268320 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.268349 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phx6s\" (UniqueName: \"kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.268426 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.370319 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.370369 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.370399 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phx6s\" (UniqueName: \"kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.370470 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.378439 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.378559 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.381519 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.389554 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phx6s\" (UniqueName: \"kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s\") pod \"nova-cell1-cell-mapping-47jfh\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") " pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:17 crc kubenswrapper[4856]: I0320 13:48:17.543698 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-47jfh" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.015057 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-47jfh"] Mar 20 13:48:18 crc kubenswrapper[4856]: W0320 13:48:18.028176 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod667e1b8e_28bc_4227_8b0b_f0195587213f.slice/crio-6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5 WatchSource:0}: Error finding container 6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5: Status 404 returned error can't find the container with id 6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5 Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.034483 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.047766 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-47jfh" event={"ID":"667e1b8e-28bc-4227-8b0b-f0195587213f","Type":"ContainerStarted","Data":"6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5"} Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.128577 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"] Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.128800 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="dnsmasq-dns" containerID="cri-o://268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab" gracePeriod=10 Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.815376 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906226 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906399 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmhgj\" (UniqueName: \"kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906503 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906606 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906651 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.906714 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc\") pod \"204d3eaf-06d1-4c05-a2a9-01d424229125\" (UID: \"204d3eaf-06d1-4c05-a2a9-01d424229125\") " Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.914903 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj" (OuterVolumeSpecName: "kube-api-access-nmhgj") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "kube-api-access-nmhgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.915894 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmhgj\" (UniqueName: \"kubernetes.io/projected/204d3eaf-06d1-4c05-a2a9-01d424229125-kube-api-access-nmhgj\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.988951 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.993483 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4856]: I0320 13:48:18.997520 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.006601 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config" (OuterVolumeSpecName: "config") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.008153 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "204d3eaf-06d1-4c05-a2a9-01d424229125" (UID: "204d3eaf-06d1-4c05-a2a9-01d424229125"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.019534 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.019562 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.019572 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.019583 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.019591 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204d3eaf-06d1-4c05-a2a9-01d424229125-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.059809 4856 generic.go:334] "Generic (PLEG): container finished" podID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerID="268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab" exitCode=0 Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.059883 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" event={"ID":"204d3eaf-06d1-4c05-a2a9-01d424229125","Type":"ContainerDied","Data":"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab"} Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 
13:48:19.059918 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" event={"ID":"204d3eaf-06d1-4c05-a2a9-01d424229125","Type":"ContainerDied","Data":"0920b89e71ad359a91c5bf5dfe6a6188d533a141b05a0c055d3bd315d49b6798"} Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.059939 4856 scope.go:117] "RemoveContainer" containerID="268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.059928 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-z6d2g" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.082656 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerStarted","Data":"88dd600e0e2eead445603b77f3af33d71b91561cfdbeb4eb78f38a3d301378d1"} Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.083513 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.115005 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-47jfh" event={"ID":"667e1b8e-28bc-4227-8b0b-f0195587213f","Type":"ContainerStarted","Data":"1927ae53faa128939cfcd70a23a0ae912818272307357559a80053d78b78317f"} Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.118971 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.655541675 podStartE2EDuration="7.118948544s" podCreationTimestamp="2026-03-20 13:48:12 +0000 UTC" firstStartedPulling="2026-03-20 13:48:13.125440579 +0000 UTC m=+1508.006466729" lastFinishedPulling="2026-03-20 13:48:18.588847468 +0000 UTC m=+1513.469873598" observedRunningTime="2026-03-20 13:48:19.114072146 +0000 UTC m=+1513.995098306" watchObservedRunningTime="2026-03-20 13:48:19.118948544 +0000 
UTC m=+1513.999974674"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.129416 4856 scope.go:117] "RemoveContainer" containerID="834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.146167 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-47jfh" podStartSLOduration=2.146148084 podStartE2EDuration="2.146148084s" podCreationTimestamp="2026-03-20 13:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:19.145754714 +0000 UTC m=+1514.026780864" watchObservedRunningTime="2026-03-20 13:48:19.146148084 +0000 UTC m=+1514.027174234"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.164039 4856 scope.go:117] "RemoveContainer" containerID="268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab"
Mar 20 13:48:19 crc kubenswrapper[4856]: E0320 13:48:19.164586 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab\": container with ID starting with 268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab not found: ID does not exist" containerID="268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.164616 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab"} err="failed to get container status \"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab\": rpc error: code = NotFound desc = could not find container \"268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab\": container with ID starting with 268730b995659d028a04dfc43ddafd9abe2b0fecefc6be471c35f622be69baab not found: ID does not exist"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.164637 4856 scope.go:117] "RemoveContainer" containerID="834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8"
Mar 20 13:48:19 crc kubenswrapper[4856]: E0320 13:48:19.165009 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8\": container with ID starting with 834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8 not found: ID does not exist" containerID="834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.165040 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8"} err="failed to get container status \"834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8\": rpc error: code = NotFound desc = could not find container \"834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8\": container with ID starting with 834a62dcd3d8e10b85c141ec72668884225e615bcb7b4adca021a79538c99aa8 not found: ID does not exist"
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.173978 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"]
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.186308 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-z6d2g"]
Mar 20 13:48:19 crc kubenswrapper[4856]: I0320 13:48:19.830848 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" path="/var/lib/kubelet/pods/204d3eaf-06d1-4c05-a2a9-01d424229125/volumes"
Mar 20 13:48:24 crc kubenswrapper[4856]: I0320 13:48:24.191180 4856 generic.go:334] "Generic (PLEG): container finished" podID="667e1b8e-28bc-4227-8b0b-f0195587213f" containerID="1927ae53faa128939cfcd70a23a0ae912818272307357559a80053d78b78317f" exitCode=0
Mar 20 13:48:24 crc kubenswrapper[4856]: I0320 13:48:24.191337 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-47jfh" event={"ID":"667e1b8e-28bc-4227-8b0b-f0195587213f","Type":"ContainerDied","Data":"1927ae53faa128939cfcd70a23a0ae912818272307357559a80053d78b78317f"}
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.412975 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.413055 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.661048 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-47jfh"
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.777712 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data\") pod \"667e1b8e-28bc-4227-8b0b-f0195587213f\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") "
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.777801 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phx6s\" (UniqueName: \"kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s\") pod \"667e1b8e-28bc-4227-8b0b-f0195587213f\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") "
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.777826 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts\") pod \"667e1b8e-28bc-4227-8b0b-f0195587213f\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") "
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.777902 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle\") pod \"667e1b8e-28bc-4227-8b0b-f0195587213f\" (UID: \"667e1b8e-28bc-4227-8b0b-f0195587213f\") "
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.783379 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts" (OuterVolumeSpecName: "scripts") pod "667e1b8e-28bc-4227-8b0b-f0195587213f" (UID: "667e1b8e-28bc-4227-8b0b-f0195587213f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.784315 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s" (OuterVolumeSpecName: "kube-api-access-phx6s") pod "667e1b8e-28bc-4227-8b0b-f0195587213f" (UID: "667e1b8e-28bc-4227-8b0b-f0195587213f"). InnerVolumeSpecName "kube-api-access-phx6s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.812851 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "667e1b8e-28bc-4227-8b0b-f0195587213f" (UID: "667e1b8e-28bc-4227-8b0b-f0195587213f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.815836 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data" (OuterVolumeSpecName: "config-data") pod "667e1b8e-28bc-4227-8b0b-f0195587213f" (UID: "667e1b8e-28bc-4227-8b0b-f0195587213f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.881131 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.881203 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phx6s\" (UniqueName: \"kubernetes.io/projected/667e1b8e-28bc-4227-8b0b-f0195587213f-kube-api-access-phx6s\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.881214 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:25 crc kubenswrapper[4856]: I0320 13:48:25.881227 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/667e1b8e-28bc-4227-8b0b-f0195587213f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.221088 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-47jfh" event={"ID":"667e1b8e-28bc-4227-8b0b-f0195587213f","Type":"ContainerDied","Data":"6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5"}
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.221125 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb1d33c593db9af676afc0a0633890348de988b879a4b65e5f5a2cb11a493f5"
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.221173 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-47jfh"
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.427520 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.427528 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.206:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.484884 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.485111 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-log" containerID="cri-o://d27f4b299032daa64a1c2fbb0a6497ce48fae2970eb08d16566671f428d9e3ea" gracePeriod=30
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.485534 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-api" containerID="cri-o://dfcea8e8e09281bcfc96738b570f941e3374939aa32a47165852731b6ed71dea" gracePeriod=30
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.551481 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.551836 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="40599482-130e-4649-a75a-9e7cc3891543" containerName="nova-scheduler-scheduler" containerID="cri-o://1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" gracePeriod=30
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.637363 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.637666 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-log" containerID="cri-o://3c96b0cee2fbdfe4534153f5a985af176373e829a592d6a22caefdab2de4ceef" gracePeriod=30
Mar 20 13:48:26 crc kubenswrapper[4856]: I0320 13:48:26.637728 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-metadata" containerID="cri-o://60a3d87fd01033f0a3d0e16a5be5e59a0968d15014710c0aa23ea179037d5fcc" gracePeriod=30
Mar 20 13:48:27 crc kubenswrapper[4856]: E0320 13:48:27.037377 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:48:27 crc kubenswrapper[4856]: E0320 13:48:27.038664 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:48:27 crc kubenswrapper[4856]: E0320 13:48:27.039731 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:48:27 crc kubenswrapper[4856]: E0320 13:48:27.039770 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="40599482-130e-4649-a75a-9e7cc3891543" containerName="nova-scheduler-scheduler"
Mar 20 13:48:27 crc kubenswrapper[4856]: I0320 13:48:27.231824 4856 generic.go:334] "Generic (PLEG): container finished" podID="8c02093c-9927-4e80-9077-f6a0390c6721" containerID="d27f4b299032daa64a1c2fbb0a6497ce48fae2970eb08d16566671f428d9e3ea" exitCode=143
Mar 20 13:48:27 crc kubenswrapper[4856]: I0320 13:48:27.231889 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerDied","Data":"d27f4b299032daa64a1c2fbb0a6497ce48fae2970eb08d16566671f428d9e3ea"}
Mar 20 13:48:27 crc kubenswrapper[4856]: I0320 13:48:27.234534 4856 generic.go:334] "Generic (PLEG): container finished" podID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerID="3c96b0cee2fbdfe4534153f5a985af176373e829a592d6a22caefdab2de4ceef" exitCode=143
Mar 20 13:48:27 crc kubenswrapper[4856]: I0320 13:48:27.234563 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerDied","Data":"3c96b0cee2fbdfe4534153f5a985af176373e829a592d6a22caefdab2de4ceef"}
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.137154 4856 scope.go:117] "RemoveContainer" containerID="7ede2af770aa7660a0b1ba7d50f4622ef7d8f2f6c4c6d3481c5395c80c2a35bb"
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.267457 4856 generic.go:334] "Generic (PLEG): container finished" podID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerID="60a3d87fd01033f0a3d0e16a5be5e59a0968d15014710c0aa23ea179037d5fcc" exitCode=0
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.267499 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerDied","Data":"60a3d87fd01033f0a3d0e16a5be5e59a0968d15014710c0aa23ea179037d5fcc"}
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.766032 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896378 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66vpz\" (UniqueName: \"kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz\") pod \"31e7f358-22b0-4bdb-a685-f0009192fd33\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") "
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896483 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data\") pod \"31e7f358-22b0-4bdb-a685-f0009192fd33\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") "
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896520 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs\") pod \"31e7f358-22b0-4bdb-a685-f0009192fd33\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") "
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896818 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle\") pod \"31e7f358-22b0-4bdb-a685-f0009192fd33\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") "
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896909 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs\") pod \"31e7f358-22b0-4bdb-a685-f0009192fd33\" (UID: \"31e7f358-22b0-4bdb-a685-f0009192fd33\") "
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.896991 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs" (OuterVolumeSpecName: "logs") pod "31e7f358-22b0-4bdb-a685-f0009192fd33" (UID: "31e7f358-22b0-4bdb-a685-f0009192fd33"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.897642 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31e7f358-22b0-4bdb-a685-f0009192fd33-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.906989 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz" (OuterVolumeSpecName: "kube-api-access-66vpz") pod "31e7f358-22b0-4bdb-a685-f0009192fd33" (UID: "31e7f358-22b0-4bdb-a685-f0009192fd33"). InnerVolumeSpecName "kube-api-access-66vpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.963937 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data" (OuterVolumeSpecName: "config-data") pod "31e7f358-22b0-4bdb-a685-f0009192fd33" (UID: "31e7f358-22b0-4bdb-a685-f0009192fd33"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.964362 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31e7f358-22b0-4bdb-a685-f0009192fd33" (UID: "31e7f358-22b0-4bdb-a685-f0009192fd33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.998865 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "31e7f358-22b0-4bdb-a685-f0009192fd33" (UID: "31e7f358-22b0-4bdb-a685-f0009192fd33"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.999047 4856 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.999066 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66vpz\" (UniqueName: \"kubernetes.io/projected/31e7f358-22b0-4bdb-a685-f0009192fd33-kube-api-access-66vpz\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.999076 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:30 crc kubenswrapper[4856]: I0320 13:48:30.999090 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e7f358-22b0-4bdb-a685-f0009192fd33-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.278588 4856 generic.go:334] "Generic (PLEG): container finished" podID="40599482-130e-4649-a75a-9e7cc3891543" containerID="1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" exitCode=0
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.280451 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40599482-130e-4649-a75a-9e7cc3891543","Type":"ContainerDied","Data":"1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5"}
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.283567 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"31e7f358-22b0-4bdb-a685-f0009192fd33","Type":"ContainerDied","Data":"a703602f20433b902155ebbe0a63e5ed4c1cdeb4813859b3e478e297731be702"}
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.283599 4856 scope.go:117] "RemoveContainer" containerID="60a3d87fd01033f0a3d0e16a5be5e59a0968d15014710c0aa23ea179037d5fcc"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.283719 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.365859 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.387818 4856 scope.go:117] "RemoveContainer" containerID="3c96b0cee2fbdfe4534153f5a985af176373e829a592d6a22caefdab2de4ceef"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.389558 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.427465 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.453170 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457388 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-metadata"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457427 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-metadata"
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457450 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40599482-130e-4649-a75a-9e7cc3891543" containerName="nova-scheduler-scheduler"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457457 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="40599482-130e-4649-a75a-9e7cc3891543" containerName="nova-scheduler-scheduler"
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457482 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="dnsmasq-dns"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457487 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="dnsmasq-dns"
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457503 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="init"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457509 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="init"
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457518 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="667e1b8e-28bc-4227-8b0b-f0195587213f" containerName="nova-manage"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457524 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="667e1b8e-28bc-4227-8b0b-f0195587213f" containerName="nova-manage"
Mar 20 13:48:31 crc kubenswrapper[4856]: E0320 13:48:31.457534 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-log"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457540 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-log"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457858 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="667e1b8e-28bc-4227-8b0b-f0195587213f" containerName="nova-manage"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457875 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-log"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457881 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="40599482-130e-4649-a75a-9e7cc3891543" containerName="nova-scheduler-scheduler"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457893 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="204d3eaf-06d1-4c05-a2a9-01d424229125" containerName="dnsmasq-dns"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.457904 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" containerName="nova-metadata-metadata"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.458845 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.460516 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.460898 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.462505 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.511435 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrxvv\" (UniqueName: \"kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv\") pod \"40599482-130e-4649-a75a-9e7cc3891543\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") "
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.511492 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle\") pod \"40599482-130e-4649-a75a-9e7cc3891543\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") "
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.511540 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data\") pod \"40599482-130e-4649-a75a-9e7cc3891543\" (UID: \"40599482-130e-4649-a75a-9e7cc3891543\") "
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.515188 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv" (OuterVolumeSpecName: "kube-api-access-mrxvv") pod "40599482-130e-4649-a75a-9e7cc3891543" (UID: "40599482-130e-4649-a75a-9e7cc3891543"). InnerVolumeSpecName "kube-api-access-mrxvv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.533616 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data" (OuterVolumeSpecName: "config-data") pod "40599482-130e-4649-a75a-9e7cc3891543" (UID: "40599482-130e-4649-a75a-9e7cc3891543"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.534190 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40599482-130e-4649-a75a-9e7cc3891543" (UID: "40599482-130e-4649-a75a-9e7cc3891543"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.613716 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpzfp\" (UniqueName: \"kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.613838 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.613927 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.614010 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.614114 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.614244 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrxvv\" (UniqueName: \"kubernetes.io/projected/40599482-130e-4649-a75a-9e7cc3891543-kube-api-access-mrxvv\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.614333 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.614504 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40599482-130e-4649-a75a-9e7cc3891543-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716062 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716171 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716255 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpzfp\" (UniqueName: \"kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716303 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716352 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.716528 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.720505 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.720581 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.720921 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.735735 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpzfp\" (UniqueName: \"kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp\") pod \"nova-metadata-0\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.778004 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:31 crc kubenswrapper[4856]: I0320 13:48:31.833980 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e7f358-22b0-4bdb-a685-f0009192fd33" path="/var/lib/kubelet/pods/31e7f358-22b0-4bdb-a685-f0009192fd33/volumes"
Mar 20 13:48:32 crc kubenswrapper[4856]: W0320 13:48:32.291372 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod467cf6ce_9c87_45d6_9968_4d5372f70cb3.slice/crio-6b2379550dbcddc223c0fb22f39d2469bc815db27958782addc2524e5370a8ad WatchSource:0}: Error finding container 6b2379550dbcddc223c0fb22f39d2469bc815db27958782addc2524e5370a8ad: Status 404 returned error can't find the container with id 6b2379550dbcddc223c0fb22f39d2469bc815db27958782addc2524e5370a8ad
Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.292768 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"40599482-130e-4649-a75a-9e7cc3891543","Type":"ContainerDied","Data":"825a11dadd9eb0bd5de6ebc2be9a3377f97a5f723d930b46362ccc16ff68732d"}
Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.292794 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.292825 4856 scope.go:117] "RemoveContainer" containerID="1779fa4c66067793b5a562a324bb520157d0cab8e2626efb3e9c6056043a7ba5" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.295306 4856 generic.go:334] "Generic (PLEG): container finished" podID="8c02093c-9927-4e80-9077-f6a0390c6721" containerID="dfcea8e8e09281bcfc96738b570f941e3374939aa32a47165852731b6ed71dea" exitCode=0 Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.295348 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerDied","Data":"dfcea8e8e09281bcfc96738b570f941e3374939aa32a47165852731b6ed71dea"} Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.295399 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8c02093c-9927-4e80-9077-f6a0390c6721","Type":"ContainerDied","Data":"4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e"} Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.295415 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a3aa1e800ea821322c7a5abfe9513ddecf7e7a49079beb899cddb319cc4468e" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.295431 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.425895 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.440101 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.449324 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.469608 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:32 crc kubenswrapper[4856]: E0320 13:48:32.470154 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-log" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.470174 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-log" Mar 20 13:48:32 crc kubenswrapper[4856]: E0320 13:48:32.470203 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-api" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.470211 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-api" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.470456 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-log" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.470479 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" containerName="nova-api-api" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.471217 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.475010 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.517915 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.550971 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551051 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2vh8\" (UniqueName: \"kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551127 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551175 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551214 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551337 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs\") pod \"8c02093c-9927-4e80-9077-f6a0390c6721\" (UID: \"8c02093c-9927-4e80-9077-f6a0390c6721\") " Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551621 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs" (OuterVolumeSpecName: "logs") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551632 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551787 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551819 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z56wq\" (UniqueName: \"kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq\") pod 
\"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.551906 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c02093c-9927-4e80-9077-f6a0390c6721-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.554558 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8" (OuterVolumeSpecName: "kube-api-access-w2vh8") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "kube-api-access-w2vh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.576050 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data" (OuterVolumeSpecName: "config-data") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.578450 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.599537 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.601871 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c02093c-9927-4e80-9077-f6a0390c6721" (UID: "8c02093c-9927-4e80-9077-f6a0390c6721"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.653962 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654010 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z56wq\" (UniqueName: \"kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654109 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654367 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654379 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654387 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654395 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c02093c-9927-4e80-9077-f6a0390c6721-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.654404 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2vh8\" (UniqueName: \"kubernetes.io/projected/8c02093c-9927-4e80-9077-f6a0390c6721-kube-api-access-w2vh8\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.660518 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.661309 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.674336 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z56wq\" (UniqueName: \"kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq\") pod \"nova-scheduler-0\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:32 crc kubenswrapper[4856]: I0320 13:48:32.794504 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.311838 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerStarted","Data":"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3"} Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.312248 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerStarted","Data":"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4"} Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.312403 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerStarted","Data":"6b2379550dbcddc223c0fb22f39d2469bc815db27958782addc2524e5370a8ad"} Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.314010 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.329425 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.339635 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.339616117 podStartE2EDuration="2.339616117s" podCreationTimestamp="2026-03-20 13:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:33.332599948 +0000 UTC m=+1528.213626088" watchObservedRunningTime="2026-03-20 13:48:33.339616117 +0000 UTC m=+1528.220642247" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.494055 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.505735 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.518531 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.520467 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.523724 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.523762 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.523826 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.535851 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673089 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673162 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc9bw\" (UniqueName: \"kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673194 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673521 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673638 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.673733 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.775329 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.775396 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.775450 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 
crc kubenswrapper[4856]: I0320 13:48:33.775479 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc9bw\" (UniqueName: \"kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.775507 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.775571 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.776417 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.780029 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.787529 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.787975 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.788240 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.791734 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc9bw\" (UniqueName: \"kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw\") pod \"nova-api-0\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.839497 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.846355 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40599482-130e-4649-a75a-9e7cc3891543" path="/var/lib/kubelet/pods/40599482-130e-4649-a75a-9e7cc3891543/volumes" Mar 20 13:48:33 crc kubenswrapper[4856]: I0320 13:48:33.847073 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c02093c-9927-4e80-9077-f6a0390c6721" path="/var/lib/kubelet/pods/8c02093c-9927-4e80-9077-f6a0390c6721/volumes" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.163942 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.166435 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.173854 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.285410 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.285484 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.285531 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5dl\" (UniqueName: \"kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.329195 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3","Type":"ContainerStarted","Data":"ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e"} Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.329237 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3","Type":"ContainerStarted","Data":"024d926d8f991c82784d31c13f408ac76c78c0baadf9d7d4ad6f019bdc0747c0"} Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.333123 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.347920 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.347899398 podStartE2EDuration="2.347899398s" podCreationTimestamp="2026-03-20 13:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:34.343558115 +0000 UTC m=+1529.224584255" watchObservedRunningTime="2026-03-20 13:48:34.347899398 +0000 UTC m=+1529.228925548" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.387149 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " 
pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.387335 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.387460 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5dl\" (UniqueName: \"kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.388359 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.388604 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.409880 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5dl\" (UniqueName: \"kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl\") pod \"redhat-operators-pwj87\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 
crc kubenswrapper[4856]: I0320 13:48:34.494744 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:34 crc kubenswrapper[4856]: I0320 13:48:34.927039 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:48:34 crc kubenswrapper[4856]: W0320 13:48:34.933361 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa13fc0_12fb_4af7_b78e_8164223507b3.slice/crio-39d9b9a5cef0abe6a69e64ef39bad97b26bcb27733498a1e18caa2089ff57f6c WatchSource:0}: Error finding container 39d9b9a5cef0abe6a69e64ef39bad97b26bcb27733498a1e18caa2089ff57f6c: Status 404 returned error can't find the container with id 39d9b9a5cef0abe6a69e64ef39bad97b26bcb27733498a1e18caa2089ff57f6c Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.337014 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerStarted","Data":"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6"} Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.337248 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerStarted","Data":"39d9b9a5cef0abe6a69e64ef39bad97b26bcb27733498a1e18caa2089ff57f6c"} Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.343288 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerStarted","Data":"77b452f8fbc35a66da9a652896b32275d2353cdfdc4f3d95fb386acc911a6b89"} Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.343347 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerStarted","Data":"fdc9d6121689074032de5b8e98217043a6aac903a3a4183cd06f9641245a15c5"} Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.343361 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerStarted","Data":"6c08bf7ff547654111137122cc764a0291452db62e24c1bcbe531b1b0fa556a1"} Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.380148 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.380131138 podStartE2EDuration="2.380131138s" podCreationTimestamp="2026-03-20 13:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:35.377477713 +0000 UTC m=+1530.258503853" watchObservedRunningTime="2026-03-20 13:48:35.380131138 +0000 UTC m=+1530.261157268" Mar 20 13:48:35 crc kubenswrapper[4856]: I0320 13:48:35.895727 4856 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podcb9ec39c-3fd5-4477-9868-44e5424f9bb3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podcb9ec39c-3fd5-4477-9868-44e5424f9bb3] : Timed out while waiting for systemd to remove kubepods-besteffort-podcb9ec39c_3fd5_4477_9868_44e5424f9bb3.slice" Mar 20 13:48:36 crc kubenswrapper[4856]: I0320 13:48:36.360600 4856 generic.go:334] "Generic (PLEG): container finished" podID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerID="270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6" exitCode=0 Mar 20 13:48:36 crc kubenswrapper[4856]: I0320 13:48:36.361965 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" 
event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerDied","Data":"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6"} Mar 20 13:48:37 crc kubenswrapper[4856]: I0320 13:48:37.370575 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerStarted","Data":"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3"} Mar 20 13:48:37 crc kubenswrapper[4856]: I0320 13:48:37.795684 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:48:38 crc kubenswrapper[4856]: I0320 13:48:38.419753 4856 generic.go:334] "Generic (PLEG): container finished" podID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerID="3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3" exitCode=0 Mar 20 13:48:38 crc kubenswrapper[4856]: I0320 13:48:38.419825 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerDied","Data":"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3"} Mar 20 13:48:39 crc kubenswrapper[4856]: I0320 13:48:39.432780 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerStarted","Data":"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad"} Mar 20 13:48:39 crc kubenswrapper[4856]: I0320 13:48:39.458025 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pwj87" podStartSLOduration=2.992371527 podStartE2EDuration="5.45800748s" podCreationTimestamp="2026-03-20 13:48:34 +0000 UTC" firstStartedPulling="2026-03-20 13:48:36.368439134 +0000 UTC m=+1531.249465264" lastFinishedPulling="2026-03-20 13:48:38.834075077 +0000 UTC m=+1533.715101217" 
observedRunningTime="2026-03-20 13:48:39.44777093 +0000 UTC m=+1534.328797060" watchObservedRunningTime="2026-03-20 13:48:39.45800748 +0000 UTC m=+1534.339033610" Mar 20 13:48:39 crc kubenswrapper[4856]: I0320 13:48:39.987650 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:48:39 crc kubenswrapper[4856]: I0320 13:48:39.987927 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:48:41 crc kubenswrapper[4856]: I0320 13:48:41.779126 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:48:41 crc kubenswrapper[4856]: I0320 13:48:41.780416 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:48:42 crc kubenswrapper[4856]: I0320 13:48:42.793454 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:48:42 crc kubenswrapper[4856]: I0320 13:48:42.793557 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting 
headers)" Mar 20 13:48:42 crc kubenswrapper[4856]: I0320 13:48:42.795190 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:48:42 crc kubenswrapper[4856]: I0320 13:48:42.829104 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:48:42 crc kubenswrapper[4856]: I0320 13:48:42.840531 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:48:43 crc kubenswrapper[4856]: I0320 13:48:43.504927 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:48:43 crc kubenswrapper[4856]: I0320 13:48:43.841065 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:48:43 crc kubenswrapper[4856]: I0320 13:48:43.841105 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:48:44 crc kubenswrapper[4856]: I0320 13:48:44.494906 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:44 crc kubenswrapper[4856]: I0320 13:48:44.495269 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:48:44 crc kubenswrapper[4856]: I0320 13:48:44.855459 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:48:44 crc kubenswrapper[4856]: I0320 13:48:44.855495 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-api" probeResult="failure" 
output="Get \"https://10.217.0.210:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:48:45 crc kubenswrapper[4856]: I0320 13:48:45.547734 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pwj87" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" probeResult="failure" output=< Mar 20 13:48:45 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:48:45 crc kubenswrapper[4856]: > Mar 20 13:48:49 crc kubenswrapper[4856]: I0320 13:48:49.778471 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:48:49 crc kubenswrapper[4856]: I0320 13:48:49.778790 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.784572 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.784644 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.793694 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.793797 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.843167 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:48:51 crc kubenswrapper[4856]: I0320 13:48:51.843205 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:48:53 crc kubenswrapper[4856]: I0320 13:48:53.850846 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-api-0" Mar 20 13:48:53 crc kubenswrapper[4856]: I0320 13:48:53.853073 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:48:53 crc kubenswrapper[4856]: I0320 13:48:53.858727 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:48:54 crc kubenswrapper[4856]: I0320 13:48:54.582165 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:48:55 crc kubenswrapper[4856]: I0320 13:48:55.544915 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pwj87" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" probeResult="failure" output=< Mar 20 13:48:55 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:48:55 crc kubenswrapper[4856]: > Mar 20 13:49:04 crc kubenswrapper[4856]: I0320 13:49:04.582642 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:49:04 crc kubenswrapper[4856]: I0320 13:49:04.640405 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:49:05 crc kubenswrapper[4856]: I0320 13:49:05.359932 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:49:05 crc kubenswrapper[4856]: I0320 13:49:05.687994 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pwj87" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" containerID="cri-o://a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad" gracePeriod=2 Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.157723 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.209664 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f5dl\" (UniqueName: \"kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl\") pod \"3fa13fc0-12fb-4af7-b78e-8164223507b3\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.209740 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities\") pod \"3fa13fc0-12fb-4af7-b78e-8164223507b3\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.209890 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content\") pod \"3fa13fc0-12fb-4af7-b78e-8164223507b3\" (UID: \"3fa13fc0-12fb-4af7-b78e-8164223507b3\") " Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.210609 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities" (OuterVolumeSpecName: "utilities") pod "3fa13fc0-12fb-4af7-b78e-8164223507b3" (UID: "3fa13fc0-12fb-4af7-b78e-8164223507b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.217504 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl" (OuterVolumeSpecName: "kube-api-access-6f5dl") pod "3fa13fc0-12fb-4af7-b78e-8164223507b3" (UID: "3fa13fc0-12fb-4af7-b78e-8164223507b3"). InnerVolumeSpecName "kube-api-access-6f5dl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.312651 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f5dl\" (UniqueName: \"kubernetes.io/projected/3fa13fc0-12fb-4af7-b78e-8164223507b3-kube-api-access-6f5dl\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.312712 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.333191 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fa13fc0-12fb-4af7-b78e-8164223507b3" (UID: "3fa13fc0-12fb-4af7-b78e-8164223507b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.415360 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fa13fc0-12fb-4af7-b78e-8164223507b3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.698393 4856 generic.go:334] "Generic (PLEG): container finished" podID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerID="a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad" exitCode=0 Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.698434 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerDied","Data":"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad"} Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.698443 4856 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pwj87" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.698462 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pwj87" event={"ID":"3fa13fc0-12fb-4af7-b78e-8164223507b3","Type":"ContainerDied","Data":"39d9b9a5cef0abe6a69e64ef39bad97b26bcb27733498a1e18caa2089ff57f6c"} Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.698484 4856 scope.go:117] "RemoveContainer" containerID="a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.720683 4856 scope.go:117] "RemoveContainer" containerID="3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.743373 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.752466 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pwj87"] Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.762294 4856 scope.go:117] "RemoveContainer" containerID="270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.817687 4856 scope.go:117] "RemoveContainer" containerID="a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad" Mar 20 13:49:06 crc kubenswrapper[4856]: E0320 13:49:06.818025 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad\": container with ID starting with a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad not found: ID does not exist" containerID="a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.818054 4856 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad"} err="failed to get container status \"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad\": rpc error: code = NotFound desc = could not find container \"a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad\": container with ID starting with a80f7dd6d95f485e38e1f9a99bf55c92e72934711c040b5eec442dcdf65538ad not found: ID does not exist" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.818073 4856 scope.go:117] "RemoveContainer" containerID="3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3" Mar 20 13:49:06 crc kubenswrapper[4856]: E0320 13:49:06.818564 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3\": container with ID starting with 3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3 not found: ID does not exist" containerID="3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.818619 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3"} err="failed to get container status \"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3\": rpc error: code = NotFound desc = could not find container \"3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3\": container with ID starting with 3175682c43a15bf32235ac2a657a3c232b35246dff4f789568f5e48336b489a3 not found: ID does not exist" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.818661 4856 scope.go:117] "RemoveContainer" containerID="270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6" Mar 20 13:49:06 crc kubenswrapper[4856]: E0320 
13:49:06.818945 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6\": container with ID starting with 270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6 not found: ID does not exist" containerID="270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6" Mar 20 13:49:06 crc kubenswrapper[4856]: I0320 13:49:06.818965 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6"} err="failed to get container status \"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6\": rpc error: code = NotFound desc = could not find container \"270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6\": container with ID starting with 270102cc07e72a8a1aa7b963e2d2694f05eb67e1fcd406ac5ddeedec6d98bce6 not found: ID does not exist" Mar 20 13:49:07 crc kubenswrapper[4856]: I0320 13:49:07.832896 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" path="/var/lib/kubelet/pods/3fa13fc0-12fb-4af7-b78e-8164223507b3/volumes" Mar 20 13:49:09 crc kubenswrapper[4856]: I0320 13:49:09.987636 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:09 crc kubenswrapper[4856]: I0320 13:49:09.987997 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 20 13:49:09 crc kubenswrapper[4856]: I0320 13:49:09.988040 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:49:09 crc kubenswrapper[4856]: I0320 13:49:09.988731 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:49:09 crc kubenswrapper[4856]: I0320 13:49:09.988787 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585" gracePeriod=600 Mar 20 13:49:10 crc kubenswrapper[4856]: E0320 13:49:10.174115 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51a8789_c529_4a2c_b8f1_dc31a3c06403.slice/crio-8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51a8789_c529_4a2c_b8f1_dc31a3c06403.slice/crio-conmon-8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:49:10 crc kubenswrapper[4856]: I0320 13:49:10.752925 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585" exitCode=0 Mar 20 13:49:10 crc kubenswrapper[4856]: 
I0320 13:49:10.753011 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585"} Mar 20 13:49:10 crc kubenswrapper[4856]: I0320 13:49:10.753638 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e"} Mar 20 13:49:10 crc kubenswrapper[4856]: I0320 13:49:10.753708 4856 scope.go:117] "RemoveContainer" containerID="b13fe3e7321f46bd5f416f4f0e446ab2a78d2f4517b8f7f4ee6ee00699e34df8" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.454570 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.510817 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.547328 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.547578 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="72c24034-7d59-49d7-b3e2-16d875f99bec" containerName="openstackclient" containerID="cri-o://31ce21f74e86925f967cb789712863c9c0a6a319a2909da14e6791c48d33e2e1" gracePeriod=2 Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.570001 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.673208 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.699888 
4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882052 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"] Mar 20 13:49:13 crc kubenswrapper[4856]: E0320 13:49:13.882370 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="extract-content" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882381 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="extract-content" Mar 20 13:49:13 crc kubenswrapper[4856]: E0320 13:49:13.882400 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882406 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" Mar 20 13:49:13 crc kubenswrapper[4856]: E0320 13:49:13.882432 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72c24034-7d59-49d7-b3e2-16d875f99bec" containerName="openstackclient" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882437 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="72c24034-7d59-49d7-b3e2-16d875f99bec" containerName="openstackclient" Mar 20 13:49:13 crc kubenswrapper[4856]: E0320 13:49:13.882449 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="extract-utilities" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882456 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="extract-utilities" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882676 4856 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="72c24034-7d59-49d7-b3e2-16d875f99bec" containerName="openstackclient" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.882690 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa13fc0-12fb-4af7-b78e-8164223507b3" containerName="registry-server" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.883240 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.888330 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.891017 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.891407 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.897782 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="rabbitmq" containerID="cri-o://d2421faa30776943955dd45b063c08f4fd3835932d949de6ba81f54b8fc10d07" gracePeriod=30 Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.912701 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.918518 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.937168 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="rabbitmq" 
containerID="cri-o://c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e" gracePeriod=30 Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.960365 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-tvqkz"] Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.961547 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.970788 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mc7m\" (UniqueName: \"kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.971404 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts\") pod \"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.971573 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz2hj\" (UniqueName: \"kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj\") pod \"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.971774 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:13 crc kubenswrapper[4856]: I0320 13:49:13.976794 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.002845 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.052355 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tvqkz"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.080353 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mc7m\" (UniqueName: \"kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.080460 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts\") pod \"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.081086 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " 
pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.081119 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz2hj\" (UniqueName: \"kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj\") pod \"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.081150 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d4cf\" (UniqueName: \"kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.081206 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.082228 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.082699 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts\") pod 
\"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.093331 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-812f-account-create-update-7d2cz"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.134858 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz2hj\" (UniqueName: \"kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj\") pod \"placement-6441-account-create-update-bl9lv\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.139804 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mc7m\" (UniqueName: \"kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m\") pod \"glance-812f-account-create-update-4crpm\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.145099 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-812f-account-create-update-7d2cz"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.153083 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.161195 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.174555 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6441-account-create-update-cktwb"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.183800 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.185282 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.185327 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d4cf\" (UniqueName: \"kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.186091 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.190956 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.224740 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6441-account-create-update-cktwb"] Mar 20 13:49:14 
crc kubenswrapper[4856]: I0320 13:49:14.240400 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d4cf\" (UniqueName: \"kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf\") pod \"root-account-create-update-tvqkz\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.242665 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.247861 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.248176 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="ovn-northd" containerID="cri-o://9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69" gracePeriod=30 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.248336 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="openstack-network-exporter" containerID="cri-o://00f17ef2ce0ea5d744a6efd0758584d67cfece02bc7267d434c74cf910e4020f" gracePeriod=30 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.262770 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.294480 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8hhp\" (UniqueName: \"kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.294592 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.297435 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-n4czn"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.307428 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.323620 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-27vk4"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.338352 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e816-account-create-update-rw6jb"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.354493 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-27vk4"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.358196 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-n4czn"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.369337 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e816-account-create-update-rw6jb"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.385058 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.385333 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-m72lx" podUID="fb213c60-487b-4248-bf86-ed69e2fac5e1" containerName="openstack-network-exporter" containerID="cri-o://13fbe8a6214cf53a695ee8776babc39be8ea87b89ebbab5b990a58da11d28f97" gracePeriod=30 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.397734 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8hhp\" (UniqueName: \"kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.397833 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.398909 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.403603 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.419234 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.474863 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8hhp\" (UniqueName: \"kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp\") pod \"barbican-e816-account-create-update-84wt2\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.474926 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sllh9"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.588483 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sllh9"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.708346 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-76pgk"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 
13:49:14.741693 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.766442 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-76pgk"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.852542 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.854641 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.869854 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.888435 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:14 crc kubenswrapper[4856]: E0320 13:49:14.907723 4856 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-c697k" message="Exiting ovn-controller (1) " Mar 20 13:49:14 crc kubenswrapper[4856]: E0320 13:49:14.907758 4856 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-c697k" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller" containerID="cri-o://a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.907796 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-c697k" 
podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller" containerID="cri-o://a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" gracePeriod=30 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.918149 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-m72lx_fb213c60-487b-4248-bf86-ed69e2fac5e1/openstack-network-exporter/0.log" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.918192 4856 generic.go:334] "Generic (PLEG): container finished" podID="fb213c60-487b-4248-bf86-ed69e2fac5e1" containerID="13fbe8a6214cf53a695ee8776babc39be8ea87b89ebbab5b990a58da11d28f97" exitCode=2 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.918299 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m72lx" event={"ID":"fb213c60-487b-4248-bf86-ed69e2fac5e1","Type":"ContainerDied","Data":"13fbe8a6214cf53a695ee8776babc39be8ea87b89ebbab5b990a58da11d28f97"} Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.928729 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.932974 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.933357 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.933511 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfv56\" (UniqueName: \"kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.940455 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.942397 4856 generic.go:334] "Generic (PLEG): container finished" podID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerID="00f17ef2ce0ea5d744a6efd0758584d67cfece02bc7267d434c74cf910e4020f" exitCode=2 Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.942436 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerDied","Data":"00f17ef2ce0ea5d744a6efd0758584d67cfece02bc7267d434c74cf910e4020f"} Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.959611 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.963228 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.969357 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:49:14 crc kubenswrapper[4856]: I0320 13:49:14.987323 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.008471 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.017820 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-wqlhb"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.035598 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-wqlhb"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.041110 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfv56\" (UniqueName: \"kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.041208 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.041283 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz2tp\" 
(UniqueName: \"kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.041369 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.042465 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.085765 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfv56\" (UniqueName: \"kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56\") pod \"nova-api-5b9c-account-create-update-cmvbr\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.115693 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fzpqw"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.138719 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fzpqw"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.143294 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qz2tp\" (UniqueName: \"kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.143387 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.143411 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv88t\" (UniqueName: \"kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.143479 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.144323 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " 
pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.157093 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-52xvf"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.181345 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-52xvf"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.187924 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz2tp\" (UniqueName: \"kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp\") pod \"nova-cell0-db61-account-create-update-qs8jw\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.211290 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.211560 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="cinder-scheduler" containerID="cri-o://0dba6d4b780a407897cd32686827c4483d0924be6ffdda04151d3f6aaee1a114" gracePeriod=30 Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.212061 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="probe" containerID="cri-o://0537ae0b31b688d61e33527cfef4b3a828f73f588fd203119e3e0fc88c53d392" gracePeriod=30 Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.228304 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.244851 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv88t\" (UniqueName: \"kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.244962 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.246483 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.287484 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288110 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-server" containerID="cri-o://99aed41847b9a822596138e8aef2e4873222dfb5643d9cd387d54e1029fa26ae" gracePeriod=30 Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288552 4856 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="swift-recon-cron" containerID="cri-o://64b602f06351a958adc8f20a603944b2103c33e0a114f4cd698496a6b2cd9d5a" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288618 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="rsync" containerID="cri-o://2bbe17d4032ceb3620e99676e538a3436047e64cd35c8e408a7e830c8d0c8916" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288660 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-expirer" containerID="cri-o://7b0eeecc01033a001f3ec16d0f85af1f0a2b22608ba9b74a4124f80db63f7023" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288689 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-updater" containerID="cri-o://8dc8373e74ff37fef6751f9af5ae4fe48e9297688f83e021db44786f7698fae2" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288716 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-auditor" containerID="cri-o://3ced5a863f2674ab088b2cfc34623e28ac2f1620c7f6f8dc4f2edb1bd867f7c6" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288751 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-replicator" containerID="cri-o://7324448e1753fd76381dd12b6e7d9dc16d8ab4a8e4930a9aeb6e2de164019847" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288786 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-server" containerID="cri-o://9c977a756055e2886ad5ca74cb43b1715a8e35dc20df5e2db03dadb213f99ae2" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288815 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-updater" containerID="cri-o://c25c88d12f3de7091d79c129a51dbb13814ceb3eb7e0f5f552600e9715f22cd3" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.288857 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-auditor" containerID="cri-o://66ddb19b6cb3cfb47423e82ae3b8ce578f9275b8ddaeaefca9cb0f6db6d03dd4" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.289054 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-replicator" containerID="cri-o://9708b2248759cc0b809d2397329f741db5de5c3791b0c0d67c59ef3236106ae9" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.289100 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-server" containerID="cri-o://336bef9fbe708a542dba755da9664d99b5431f33ebb505b3d410fc05b0726883" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.289131 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-reaper" containerID="cri-o://bf17e5b77e3a7a1bdf3ad21621b736ab3a7b00f1bfc4f9f9e63067c32d46e273" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.289161 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-auditor" containerID="cri-o://7fec7d9d05c7e6a547275f47329f4ae8d58fe68cc7259691ff5842999eacf987" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.289229 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-replicator" containerID="cri-o://103cdf38f9ed46d33180e33da2171b27c42859dedc97518509e187489741b123" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.296189 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv88t\" (UniqueName: \"kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t\") pod \"nova-cell1-9268-account-create-update-lbntw\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " pod="openstack/nova-cell1-9268-account-create-update-lbntw"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.317565 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-qs8jw"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.341400 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-wg8jz"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.341928 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-lbntw"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.417618 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-wg8jz"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.441953 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.461690 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-6hmzq"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.477911 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-6hmzq"]
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.510474 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af is running failed: container process not found" containerID="a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.514157 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.514375 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af is running failed: container process not found" containerID="a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.514610 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api-log" containerID="cri-o://98f8a872358d11589374e68a262e864329cd9ae8329df4f8a4f3f630a5b9881f" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.514764 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api" containerID="cri-o://609ee66b8ffd12f755f76cd9565ffad47cf29b376b4955ce6c3cee2c6b3c9d01" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.526848 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af is running failed: container process not found" containerID="a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" cmd=["/usr/local/bin/container-scripts/ovn_controller_readiness.sh"]
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.526897 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-c697k" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.574441 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-qhrq6"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.616965 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" probeResult="failure" output=<
Mar 20 13:49:15 crc kubenswrapper[4856]: cat: /var/run/openvswitch/ovs-vswitchd.pid: No such file or directory
Mar 20 13:49:15 crc kubenswrapper[4856]: ERROR - Failed to get pid for ovs-vswitchd, exit status: 0
Mar 20 13:49:15 crc kubenswrapper[4856]: >
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.619701 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" containerID="cri-o://1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" gracePeriod=29
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.627948 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.628724 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="openstack-network-exporter" containerID="cri-o://97772f6ee946ca97d67b8f71fc7f1f0295d3e5e51873e5b35681e4f70ee23a04" gracePeriod=300
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.705770 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58f96446cc-blkvz"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.706054 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58f96446cc-blkvz" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-api" containerID="cri-o://22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.706204 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-58f96446cc-blkvz" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-httpd" containerID="cri-o://bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.742405 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-qhrq6"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.783978 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.784667 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="openstack-network-exporter" containerID="cri-o://11061327d008cbbe148b32ba2127eb9d46fde32a3092f69a96fe49fa415abcd1" gracePeriod=300
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.793236 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.793517 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="dnsmasq-dns" containerID="cri-o://de349a9fac18ad1354c9d674041dc5baa22d5b5162beba99ee8edb004fcba1ea" gracePeriod=10
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.832402 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="ovsdbserver-sb" containerID="cri-o://e82da2e027c260fd3e464cbc4c2865412cd62f8b83fa8ce9156736fd50719665" gracePeriod=300
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.857771 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.104:5671: connect: connection refused"
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.858163 4856 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=<
Mar 20 13:49:15 crc kubenswrapper[4856]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 20 13:49:15 crc kubenswrapper[4856]: + source /usr/local/bin/container-scripts/functions
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNBridge=br-int
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNRemote=tcp:localhost:6642
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNEncapType=geneve
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNAvailabilityZones=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ EnableChassisAsGateway=true
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ PhysicalNetworks=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNHostName=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ ovs_dir=/var/lib/openvswitch
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + cleanup_ovsdb_server_semaphore
Mar 20 13:49:15 crc kubenswrapper[4856]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 20 13:49:15 crc kubenswrapper[4856]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-qxlnx" message=<
Mar 20 13:49:15 crc kubenswrapper[4856]: Exiting ovsdb-server (5) [ OK ]
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 20 13:49:15 crc kubenswrapper[4856]: + source /usr/local/bin/container-scripts/functions
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNBridge=br-int
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNRemote=tcp:localhost:6642
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNEncapType=geneve
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNAvailabilityZones=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ EnableChassisAsGateway=true
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ PhysicalNetworks=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNHostName=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ ovs_dir=/var/lib/openvswitch
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + cleanup_ovsdb_server_semaphore
Mar 20 13:49:15 crc kubenswrapper[4856]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 20 13:49:15 crc kubenswrapper[4856]: >
Mar 20 13:49:15 crc kubenswrapper[4856]: E0320 13:49:15.858198 4856 kuberuntime_container.go:691] "PreStop hook failed" err=<
Mar 20 13:49:15 crc kubenswrapper[4856]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh
Mar 20 13:49:15 crc kubenswrapper[4856]: + source /usr/local/bin/container-scripts/functions
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNBridge=br-int
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNRemote=tcp:localhost:6642
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNEncapType=geneve
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNAvailabilityZones=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ EnableChassisAsGateway=true
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ PhysicalNetworks=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ OVNHostName=
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ DB_FILE=/etc/openvswitch/conf.db
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ ovs_dir=/var/lib/openvswitch
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows
Mar 20 13:49:15 crc kubenswrapper[4856]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + sleep 0.5
Mar 20 13:49:15 crc kubenswrapper[4856]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']'
Mar 20 13:49:15 crc kubenswrapper[4856]: + cleanup_ovsdb_server_semaphore
Mar 20 13:49:15 crc kubenswrapper[4856]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server
Mar 20 13:49:15 crc kubenswrapper[4856]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd
Mar 20 13:49:15 crc kubenswrapper[4856]: > pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" containerID="cri-o://a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.858227 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" containerID="cri-o://a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" gracePeriod=29
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.874338 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0081bda3-3f0b-4c1e-a7ab-90af9235521f" path="/var/lib/kubelet/pods/0081bda3-3f0b-4c1e-a7ab-90af9235521f/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.875206 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a11a777-2932-4a56-898d-2de11472cbc9" path="/var/lib/kubelet/pods/0a11a777-2932-4a56-898d-2de11472cbc9/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.889029 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b8951ab-81c4-4f8f-8d45-f061c3a397da" path="/var/lib/kubelet/pods/2b8951ab-81c4-4f8f-8d45-f061c3a397da/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.889659 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="394ec9f9-f47c-4f12-af34-26a3953f7668" path="/var/lib/kubelet/pods/394ec9f9-f47c-4f12-af34-26a3953f7668/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.891170 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e59b689-e9d4-460b-8a82-50770f4d4422" path="/var/lib/kubelet/pods/4e59b689-e9d4-460b-8a82-50770f4d4422/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.891697 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64c33778-140d-48df-89fa-ec1719ae6f2d" path="/var/lib/kubelet/pods/64c33778-140d-48df-89fa-ec1719ae6f2d/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.892234 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d40f3b7-d738-43d5-aa70-943d6c2afd59" path="/var/lib/kubelet/pods/6d40f3b7-d738-43d5-aa70-943d6c2afd59/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.906855 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c34367e-1bb1-4e1d-8a11-190bca797f8e" path="/var/lib/kubelet/pods/8c34367e-1bb1-4e1d-8a11-190bca797f8e/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.916667 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="976fa187-ddb7-4116-8476-fb55efdbe660" path="/var/lib/kubelet/pods/976fa187-ddb7-4116-8476-fb55efdbe660/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.918225 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0094c63-b84f-4a1c-839b-47da04da9efb" path="/var/lib/kubelet/pods/a0094c63-b84f-4a1c-839b-47da04da9efb/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.921083 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de5a94fd-f271-4229-be43-98ca4a079573" path="/var/lib/kubelet/pods/de5a94fd-f271-4229-be43-98ca4a079573/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.922181 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39393cf-dda0-4755-8e66-fc571afa2a1a" path="/var/lib/kubelet/pods/e39393cf-dda0-4755-8e66-fc571afa2a1a/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.923903 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f46ba794-6fed-49d3-a0ce-1be8e5a623d4" path="/var/lib/kubelet/pods/f46ba794-6fed-49d3-a0ce-1be8e5a623d4/volumes"
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.924606 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.924633 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-47jfh"]
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.945645 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-log" containerID="cri-o://32c113b6f7715bc1e4450f3f402b1f997eacbd1f27bf889a91c9380bda22c42d" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.946729 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-httpd" containerID="cri-o://5bb29030c4a50eae6ca1db03ef400e510392fd28217af8dc0f5c5c0444dfd46e" gracePeriod=30
Mar 20 13:49:15 crc kubenswrapper[4856]: I0320 13:49:15.974113 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-47jfh"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.010972 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qm47c"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.040046 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="ovsdbserver-nb" containerID="cri-o://e74e7181e50b8482827268fba5857dee7206b3d88d8299a92ee57e8febb4c398" gracePeriod=300
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.069822 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qm47c"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.108833 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.117093 4856 generic.go:334] "Generic (PLEG): container finished" podID="5e2c318a-4df7-4434-8f38-406da145ff89" containerID="a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.117181 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k" event={"ID":"5e2c318a-4df7-4434-8f38-406da145ff89","Type":"ContainerDied","Data":"a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.119712 4856 generic.go:334] "Generic (PLEG): container finished" podID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerID="d2421faa30776943955dd45b063c08f4fd3835932d949de6ba81f54b8fc10d07" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.119776 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerDied","Data":"d2421faa30776943955dd45b063c08f4fd3835932d949de6ba81f54b8fc10d07"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.120342 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-tvqkz"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.122744 4856 generic.go:334] "Generic (PLEG): container finished" podID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerID="de349a9fac18ad1354c9d674041dc5baa22d5b5162beba99ee8edb004fcba1ea" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.122808 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" event={"ID":"636ad94a-b2ac-42c8-b83d-063d66cfeaf8","Type":"ContainerDied","Data":"de349a9fac18ad1354c9d674041dc5baa22d5b5162beba99ee8edb004fcba1ea"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.125041 4856 generic.go:334] "Generic (PLEG): container finished" podID="4a53fecc-3af1-4ced-acd9-198296d50771" containerID="98f8a872358d11589374e68a262e864329cd9ae8329df4f8a4f3f630a5b9881f" exitCode=143
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.125100 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerDied","Data":"98f8a872358d11589374e68a262e864329cd9ae8329df4f8a4f3f630a5b9881f"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.138356 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.139898 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-httpd" containerID="cri-o://2e30bbf7e5c9ce212c4664a13de9d24567d97e9650bc51aebc00393f15f7368c" gracePeriod=30
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.140322 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-log" containerID="cri-o://eab1a972ced564d26d0363b491849a05a6f3e49a90ed55b28b5329a2b2fb593a" gracePeriod=30
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.166004 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54db87fb-r8q4w"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176462 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="2bbe17d4032ceb3620e99676e538a3436047e64cd35c8e408a7e830c8d0c8916" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176530 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="7b0eeecc01033a001f3ec16d0f85af1f0a2b22608ba9b74a4124f80db63f7023" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176544 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="8dc8373e74ff37fef6751f9af5ae4fe48e9297688f83e021db44786f7698fae2" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176553 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="3ced5a863f2674ab088b2cfc34623e28ac2f1620c7f6f8dc4f2edb1bd867f7c6" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176628 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="7324448e1753fd76381dd12b6e7d9dc16d8ab4a8e4930a9aeb6e2de164019847" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176641 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="9c977a756055e2886ad5ca74cb43b1715a8e35dc20df5e2db03dadb213f99ae2" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176650 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="c25c88d12f3de7091d79c129a51dbb13814ceb3eb7e0f5f552600e9715f22cd3" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176660 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="66ddb19b6cb3cfb47423e82ae3b8ce578f9275b8ddaeaefca9cb0f6db6d03dd4" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176670 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="9708b2248759cc0b809d2397329f741db5de5c3791b0c0d67c59ef3236106ae9" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176740 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="336bef9fbe708a542dba755da9664d99b5431f33ebb505b3d410fc05b0726883" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176758 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="bf17e5b77e3a7a1bdf3ad21621b736ab3a7b00f1bfc4f9f9e63067c32d46e273" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176768 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="7fec7d9d05c7e6a547275f47329f4ae8d58fe68cc7259691ff5842999eacf987" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176802 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="103cdf38f9ed46d33180e33da2171b27c42859dedc97518509e187489741b123" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.176813 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="99aed41847b9a822596138e8aef2e4873222dfb5643d9cd387d54e1029fa26ae" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177193 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"2bbe17d4032ceb3620e99676e538a3436047e64cd35c8e408a7e830c8d0c8916"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177575 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"7b0eeecc01033a001f3ec16d0f85af1f0a2b22608ba9b74a4124f80db63f7023"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177628 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"8dc8373e74ff37fef6751f9af5ae4fe48e9297688f83e021db44786f7698fae2"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177643 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"3ced5a863f2674ab088b2cfc34623e28ac2f1620c7f6f8dc4f2edb1bd867f7c6"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177656 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"7324448e1753fd76381dd12b6e7d9dc16d8ab4a8e4930a9aeb6e2de164019847"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177671 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"9c977a756055e2886ad5ca74cb43b1715a8e35dc20df5e2db03dadb213f99ae2"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177706 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"c25c88d12f3de7091d79c129a51dbb13814ceb3eb7e0f5f552600e9715f22cd3"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177721 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"66ddb19b6cb3cfb47423e82ae3b8ce578f9275b8ddaeaefca9cb0f6db6d03dd4"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177732 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"9708b2248759cc0b809d2397329f741db5de5c3791b0c0d67c59ef3236106ae9"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177744 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"336bef9fbe708a542dba755da9664d99b5431f33ebb505b3d410fc05b0726883"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177807 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"bf17e5b77e3a7a1bdf3ad21621b736ab3a7b00f1bfc4f9f9e63067c32d46e273"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177823 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"7fec7d9d05c7e6a547275f47329f4ae8d58fe68cc7259691ff5842999eacf987"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177835 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"103cdf38f9ed46d33180e33da2171b27c42859dedc97518509e187489741b123"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.177845 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"99aed41847b9a822596138e8aef2e4873222dfb5643d9cd387d54e1029fa26ae"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.192357 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54db87fb-r8q4w" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-log" containerID="cri-o://1851f20ad50d19aed32d3be103e4e1bc3e4b3415498ed43e1a10c91964f72276" gracePeriod=30
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.192547 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-54db87fb-r8q4w" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-api" containerID="cri-o://89e762b7150be959d006f5453a95ec72bc93fd09878cdcfd7e75bf3a0eb48e5a" gracePeriod=30
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.205380 4856 generic.go:334] "Generic (PLEG): container finished" podID="72c24034-7d59-49d7-b3e2-16d875f99bec" containerID="31ce21f74e86925f967cb789712863c9c0a6a319a2909da14e6791c48d33e2e1" exitCode=137
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.217948 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_822c90b2-be4b-4764-95fb-b0fb02a7a90a/ovsdbserver-nb/0.log"
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.217994 4856 generic.go:334] "Generic (PLEG): container finished" podID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerID="11061327d008cbbe148b32ba2127eb9d46fde32a3092f69a96fe49fa415abcd1" exitCode=2
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.218067 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerDied","Data":"11061327d008cbbe148b32ba2127eb9d46fde32a3092f69a96fe49fa415abcd1"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.229637 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b1414886-740d-404d-997c-d10dcbfbfc06/ovsdbserver-sb/0.log"
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.229686 4856 generic.go:334] "Generic (PLEG): container finished" podID="b1414886-740d-404d-997c-d10dcbfbfc06" containerID="97772f6ee946ca97d67b8f71fc7f1f0295d3e5e51873e5b35681e4f70ee23a04" exitCode=2
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.229701 4856 generic.go:334] "Generic (PLEG): container finished" podID="b1414886-740d-404d-997c-d10dcbfbfc06" containerID="e82da2e027c260fd3e464cbc4c2865412cd62f8b83fa8ce9156736fd50719665" exitCode=143
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.229758 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerDied","Data":"97772f6ee946ca97d67b8f71fc7f1f0295d3e5e51873e5b35681e4f70ee23a04"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.229782 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerDied","Data":"e82da2e027c260fd3e464cbc4c2865412cd62f8b83fa8ce9156736fd50719665"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.231844 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6bw48"]
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.258523 4856 generic.go:334] "Generic (PLEG): container finished" podID="8e5225f1-7607-4e11-904f-0e40e483d384" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" exitCode=0
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.258566 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerDied","Data":"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a"}
Mar 20 13:49:16 crc kubenswrapper[4856]: I0320
13:49:16.289477 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6bw48"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.344259 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-171f-account-create-update-qzwkr"] Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.353144 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:16 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: if [ -n "glance" ]; then Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="glance" Mar 20 13:49:16 crc kubenswrapper[4856]: else Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:16 crc kubenswrapper[4856]: fi Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:16 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:16 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:16 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:16 crc kubenswrapper[4856]: # support updates Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.354746 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:16 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: if [ -n "" ]; then Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="" Mar 20 13:49:16 crc kubenswrapper[4856]: else Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:16 crc kubenswrapper[4856]: fi Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:16 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:16 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:16 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:16 crc kubenswrapper[4856]: # support updates Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.354798 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-812f-account-create-update-4crpm" podUID="83848803-038e-4b5a-b161-9f20a629ae9a" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.356308 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-tvqkz" podUID="bca1680a-2f52-465d-83e2-93fcbf318e19" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.397357 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:16 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: if [ -n "placement" ]; then Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="placement" Mar 20 13:49:16 crc 
kubenswrapper[4856]: else Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:16 crc kubenswrapper[4856]: fi Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:16 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:16 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:16 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:16 crc kubenswrapper[4856]: # support updates Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.398590 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-6441-account-create-update-bl9lv" podUID="6eaf0274-e5e2-4ece-868f-2e7ef81d0e72" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.400579 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-171f-account-create-update-qzwkr"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.434947 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2mn42"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.494998 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.495571 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-844867ddc-kgprv" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-httpd" containerID="cri-o://c0500e5534d244ad7df38fc881c26294037eedb93e6d08de11d7ab3a2446cade" gracePeriod=30 Mar 20 
13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.495961 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-844867ddc-kgprv" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-server" containerID="cri-o://1b8689a9cd33b372d58dba21c6bfa7ae8b7939cff4da589d3f1a21739591dc55" gracePeriod=30 Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.530345 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fc05-account-create-update-4c8xw"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.552207 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c697k" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.572372 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2mn42"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.595086 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-m72lx_fb213c60-487b-4248-bf86-ed69e2fac5e1/openstack-network-exporter/0.log" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.595232 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.609807 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-94qws"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618173 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618380 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618439 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618469 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618527 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: 
I0320 13:49:16.618570 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfrf2\" (UniqueName: \"kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618613 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618651 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md6ft\" (UniqueName: \"kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618725 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.619078 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle\") pod \"5e2c318a-4df7-4434-8f38-406da145ff89\" (UID: \"5e2c318a-4df7-4434-8f38-406da145ff89\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.619121 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.619153 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.619188 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs\") pod \"fb213c60-487b-4248-bf86-ed69e2fac5e1\" (UID: \"fb213c60-487b-4248-bf86-ed69e2fac5e1\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.620011 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.618280 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.620214 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run" (OuterVolumeSpecName: "var-run") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.620300 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.620526 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.620593 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.621020 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config" (OuterVolumeSpecName: "config") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.621644 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts" (OuterVolumeSpecName: "scripts") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.675357 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft" (OuterVolumeSpecName: "kube-api-access-md6ft") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "kube-api-access-md6ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.676852 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fc05-account-create-update-4c8xw"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.677424 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2" (OuterVolumeSpecName: "kube-api-access-pfrf2") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). InnerVolumeSpecName "kube-api-access-pfrf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.720603 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config\") pod \"72c24034-7d59-49d7-b3e2-16d875f99bec\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.720938 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle\") pod \"72c24034-7d59-49d7-b3e2-16d875f99bec\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.720959 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret\") pod \"72c24034-7d59-49d7-b3e2-16d875f99bec\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721185 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk6nq\" (UniqueName: \"kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq\") pod \"72c24034-7d59-49d7-b3e2-16d875f99bec\" (UID: \"72c24034-7d59-49d7-b3e2-16d875f99bec\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721621 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfrf2\" (UniqueName: \"kubernetes.io/projected/fb213c60-487b-4248-bf86-ed69e2fac5e1-kube-api-access-pfrf2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721640 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md6ft\" (UniqueName: 
\"kubernetes.io/projected/5e2c318a-4df7-4434-8f38-406da145ff89-kube-api-access-md6ft\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721650 4856 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721659 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb213c60-487b-4248-bf86-ed69e2fac5e1-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721668 4856 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721675 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721684 4856 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5e2c318a-4df7-4434-8f38-406da145ff89-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721691 4856 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fb213c60-487b-4248-bf86-ed69e2fac5e1-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.721699 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5e2c318a-4df7-4434-8f38-406da145ff89-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.732335 4856 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.741785 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.747604 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-94qws"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.755440 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-24c45"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.770411 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-24c45"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.770562 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq" (OuterVolumeSpecName: "kube-api-access-dk6nq") pod "72c24034-7d59-49d7-b3e2-16d875f99bec" (UID: "72c24034-7d59-49d7-b3e2-16d875f99bec"). InnerVolumeSpecName "kube-api-access-dk6nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.782179 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.786096 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.786884 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72c24034-7d59-49d7-b3e2-16d875f99bec" (UID: "72c24034-7d59-49d7-b3e2-16d875f99bec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.787033 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5e2c318a-4df7-4434-8f38-406da145ff89" (UID: "5e2c318a-4df7-4434-8f38-406da145ff89"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.794147 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "72c24034-7d59-49d7-b3e2-16d875f99bec" (UID: "72c24034-7d59-49d7-b3e2-16d875f99bec"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.831450 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "fb213c60-487b-4248-bf86-ed69e2fac5e1" (UID: "fb213c60-487b-4248-bf86-ed69e2fac5e1"). 
InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.843479 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:16 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: if [ -n "barbican" ]; then Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="barbican" Mar 20 13:49:16 crc kubenswrapper[4856]: else Mar 20 13:49:16 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:16 crc kubenswrapper[4856]: fi Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:16 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:16 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:16 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:16 crc kubenswrapper[4856]: # support updates Mar 20 13:49:16 crc kubenswrapper[4856]: Mar 20 13:49:16 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.847385 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851350 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851465 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb213c60-487b-4248-bf86-ed69e2fac5e1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851570 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk6nq\" (UniqueName: \"kubernetes.io/projected/72c24034-7d59-49d7-b3e2-16d875f99bec-kube-api-access-dk6nq\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851640 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e2c318a-4df7-4434-8f38-406da145ff89-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851695 4856 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.851754 4856 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: E0320 13:49:16.850056 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-e816-account-create-update-84wt2" podUID="d744a3b5-7023-416f-85cf-62400a452558" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.857996 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-zgj9s"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.894145 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.895312 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-log" containerID="cri-o://fdc9d6121689074032de5b8e98217043a6aac903a3a4183cd06f9641245a15c5" gracePeriod=30 Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.895718 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-api" containerID="cri-o://77b452f8fbc35a66da9a652896b32275d2353cdfdc4f3d95fb386acc911a6b89" gracePeriod=30 Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.899151 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "72c24034-7d59-49d7-b3e2-16d875f99bec" (UID: "72c24034-7d59-49d7-b3e2-16d875f99bec"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.905139 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.909699 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-zgj9s"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.912682 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.915994 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953190 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953243 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953330 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqxmt\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953472 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953515 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953540 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953564 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953598 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953626 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953749 
4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.953779 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd\") pod \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\" (UID: \"0a5438ec-0454-4d8e-b356-f9b87b66c2d7\") " Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.954879 4856 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/72c24034-7d59-49d7-b3e2-16d875f99bec-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.955523 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.957199 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.958324 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.983282 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.983599 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4856]: I0320 13:49:16.990618 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:16.996478 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.000757 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.003116 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt" (OuterVolumeSpecName: "kube-api-access-rqxmt") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "kube-api-access-rqxmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.004177 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data" (OuterVolumeSpecName: "config-data") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.023407 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.032114 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-pjrhc"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.046132 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-pjrhc"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056022 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcfvx\" (UniqueName: \"kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056244 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056286 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056320 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056347 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc\") pod \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\" (UID: \"636ad94a-b2ac-42c8-b83d-063d66cfeaf8\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056732 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqxmt\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-kube-api-access-rqxmt\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056748 4856 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056757 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056764 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056782 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056793 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056801 4856 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056809 4856 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056818 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.056825 4856 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.059713 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.072087 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx" (OuterVolumeSpecName: "kube-api-access-lcfvx") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "kube-api-access-lcfvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.072858 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.080175 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.080502 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-log" containerID="cri-o://70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.081135 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-metadata" containerID="cri-o://bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.086629 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.096678 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bv7js"] Mar 20 13:49:17 crc kubenswrapper[4856]: W0320 13:49:17.101694 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbf696bc_ec01_4d4d_a1c7_b3b6a5d262a6.slice/crio-9be6df78dfaf65080d7bdb1d6ebc0165ac757fbc03c10c637ba5c46722bf65e1 WatchSource:0}: Error finding container 9be6df78dfaf65080d7bdb1d6ebc0165ac757fbc03c10c637ba5c46722bf65e1: Status 404 returned error can't find the container with id 9be6df78dfaf65080d7bdb1d6ebc0165ac757fbc03c10c637ba5c46722bf65e1 Mar 20 13:49:17 crc 
kubenswrapper[4856]: E0320 13:49:17.130940 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "nova_api" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="nova_api" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.132637 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-api-db-secret\\\" not found\"" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" podUID="dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.162096 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcfvx\" (UniqueName: \"kubernetes.io/projected/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-kube-api-access-lcfvx\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.162396 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bv7js"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.216015 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.238708 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-99lcm"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.278516 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.279043 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-99lcm"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.299953 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.306409 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.306631 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-56d6d658ff-ch8jp" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker-log" containerID="cri-o://d930765c8702d78cbe6f1e7514f35eb8ca4969feb1ad881999f4f96ab179ba9c" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.306767 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-56d6d658ff-ch8jp" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker" containerID="cri-o://981b8b827dab20c49a4b95a3ff976be714d7cf0a7f3e2f70b0c810a4c1492d48" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.312991 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.313241 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener-log" containerID="cri-o://53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.313928 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener" containerID="cri-o://0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.331021 4856 
generic.go:334] "Generic (PLEG): container finished" podID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerID="fdc9d6121689074032de5b8e98217043a6aac903a3a4183cd06f9641245a15c5" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.331149 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.331175 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerDied","Data":"fdc9d6121689074032de5b8e98217043a6aac903a3a4183cd06f9641245a15c5"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.331405 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c47c6db4b-7s8m7" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api-log" containerID="cri-o://39d74692a61d64ac3b4733a0a061eddb4db4c4aede15d1da75d20e1e1dc827fa" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.331500 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6c47c6db4b-7s8m7" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api" containerID="cri-o://a9aa0ace58f4d55b503d781dd43d4983ad8cbc31b36675ba56fa2bddfba479db" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.344610 4856 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-tvqkz" secret="" err="secret \"galera-openstack-cell1-dockercfg-dr8n4\" not found" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.344995 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tvqkz" event={"ID":"bca1680a-2f52-465d-83e2-93fcbf318e19","Type":"ContainerStarted","Data":"f1fe12659f7f3ba5a58b2518fb6116e39e5600c51db6e53425afeb9e8269f917"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.350091 4856 generic.go:334] "Generic (PLEG): container finished" podID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerID="70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.350136 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerDied","Data":"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.362856 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-812f-account-create-update-4crpm" event={"ID":"83848803-038e-4b5a-b161-9f20a629ae9a","Type":"ContainerStarted","Data":"f3a4edae93642c1b8ab80ce0a7dbbb635d1751a55b5e9f307d5cfb54976e7efb"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.366985 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.367061 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc 
kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.367252 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="00a8314e-4faf-4926-82f2-35c25154a7b5" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.367656 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: 
MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "glance" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="glance" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.368874 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-tvqkz" podUID="bca1680a-2f52-465d-83e2-93fcbf318e19" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.368938 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-812f-account-create-update-4crpm" podUID="83848803-038e-4b5a-b161-9f20a629ae9a" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.378065 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tvqkz"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.378529 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.379664 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-bl9lv" event={"ID":"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72","Type":"ContainerStarted","Data":"5af38e17ef595e03b168b32814e4ef3a88f3cbda06209ce1471a404be0c6697f"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.380898 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.381229 4856 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.381391 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts podName:bca1680a-2f52-465d-83e2-93fcbf318e19 nodeName:}" failed. No retries permitted until 2026-03-20 13:49:17.881260152 +0000 UTC m=+1572.762286282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts") pod "root-account-create-update-tvqkz" (UID: "bca1680a-2f52-465d-83e2-93fcbf318e19") : configmap "openstack-cell1-scripts" not found Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.382504 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" event={"ID":"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6","Type":"ContainerStarted","Data":"9be6df78dfaf65080d7bdb1d6ebc0165ac757fbc03c10c637ba5c46722bf65e1"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.387148 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e816-account-create-update-84wt2" event={"ID":"d744a3b5-7023-416f-85cf-62400a452558","Type":"ContainerStarted","Data":"d12cd78e36f925abdae6e61b569af47f675a1515828b07a75fc751de0bf429db"} Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.389283 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "barbican" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="barbican" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" 
Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.389943 4856 generic.go:334] "Generic (PLEG): container finished" podID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerID="bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.389965 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerDied","Data":"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53"} Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.394697 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-e816-account-create-update-84wt2" podUID="d744a3b5-7023-416f-85cf-62400a452558" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.402307 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c697k" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.402428 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c697k" event={"ID":"5e2c318a-4df7-4434-8f38-406da145ff89","Type":"ContainerDied","Data":"8697f45617c9efd5e62dc6320f06cd2627911323c81a6628222d5424e26800c6"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.402500 4856 scope.go:117] "RemoveContainer" containerID="a3f4426c2cc089b9a22d431ef84ee797b1d4d7d068e5271a43bc7d364770b5af" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.403547 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "placement" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="placement" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. 
MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.413053 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"placement-db-secret\\\" not found\"" pod="openstack/placement-6441-account-create-update-bl9lv" podUID="6eaf0274-e5e2-4ece-868f-2e7ef81d0e72" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.414474 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_822c90b2-be4b-4764-95fb-b0fb02a7a90a/ovsdbserver-nb/0.log" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.414592 4856 generic.go:334] "Generic (PLEG): container finished" podID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerID="e74e7181e50b8482827268fba5857dee7206b3d88d8299a92ee57e8febb4c398" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.415332 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerDied","Data":"e74e7181e50b8482827268fba5857dee7206b3d88d8299a92ee57e8febb4c398"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.416584 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"822c90b2-be4b-4764-95fb-b0fb02a7a90a","Type":"ContainerDied","Data":"d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.416661 4856 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d37cd1d79b85f7f44aed2d88a471f6bd808f564cf79b61234cbbbf59d2f396ce" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.429375 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.431255 4856 generic.go:334] "Generic (PLEG): container finished" podID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerID="0537ae0b31b688d61e33527cfef4b3a828f73f588fd203119e3e0fc88c53d392" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.431386 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerDied","Data":"0537ae0b31b688d61e33527cfef4b3a828f73f588fd203119e3e0fc88c53d392"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.436744 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.447960 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config" (OuterVolumeSpecName: "config") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.464326 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.464552 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.467672 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.486762 4856 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.487050 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.482405 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dms79"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.490527 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" containerName="galera" containerID="cri-o://08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.496311 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dms79"] Mar 20 13:49:17 
crc kubenswrapper[4856]: I0320 13:49:17.497973 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-m72lx_fb213c60-487b-4248-bf86-ed69e2fac5e1/openstack-network-exporter/0.log" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.498259 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-m72lx" event={"ID":"fb213c60-487b-4248-bf86-ed69e2fac5e1","Type":"ContainerDied","Data":"401ad87433ebce0dd685dab4fde17359315208139163df09b85629d7729cf06c"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.498384 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-m72lx" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.516024 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a5438ec-0454-4d8e-b356-f9b87b66c2d7","Type":"ContainerDied","Data":"164e3875551f0c6e8999e3b06c1a19b93e4ddd6e550c2b7f47c7041ccdd268ec"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.516167 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.517659 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2ngth"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.518422 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a5438ec-0454-4d8e-b356-f9b87b66c2d7" (UID: "0a5438ec-0454-4d8e-b356-f9b87b66c2d7"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.533143 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.544559 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.544799 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerName="nova-cell1-conductor-conductor" containerID="cri-o://9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.556878 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "636ad94a-b2ac-42c8-b83d-063d66cfeaf8" (UID: "636ad94a-b2ac-42c8-b83d-063d66cfeaf8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.560344 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2ngth"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.577439 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.577676 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" containerID="cri-o://ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" gracePeriod=30 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.587734 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" event={"ID":"636ad94a-b2ac-42c8-b83d-063d66cfeaf8","Type":"ContainerDied","Data":"14a75f74f0b2d3502e4b9ac703c52a1b8bafd4d32171f2d5f26b9fc3cd75bff1"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.587846 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-4wp22" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.588302 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:17 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: if [ -n "nova_cell1" ]; then Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="nova_cell1" Mar 20 13:49:17 crc kubenswrapper[4856]: else Mar 20 13:49:17 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:17 crc kubenswrapper[4856]: fi Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:17 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:17 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:17 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:17 crc kubenswrapper[4856]: # support updates Mar 20 13:49:17 crc kubenswrapper[4856]: Mar 20 13:49:17 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.591469 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-9268-account-create-update-lbntw" podUID="ef92f699-7db4-4425-949a-693de8e803a3" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.598048 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a5438ec-0454-4d8e-b356-f9b87b66c2d7-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.598072 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.598083 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/636ad94a-b2ac-42c8-b83d-063d66cfeaf8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.602682 4856 generic.go:334] "Generic (PLEG): container finished" podID="9010796b-5362-4885-8a2c-19668efe6e25" containerID="32c113b6f7715bc1e4450f3f402b1f997eacbd1f27bf889a91c9380bda22c42d" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.602750 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerDied","Data":"32c113b6f7715bc1e4450f3f402b1f997eacbd1f27bf889a91c9380bda22c42d"} Mar 20 
13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.607205 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.620775 4856 generic.go:334] "Generic (PLEG): container finished" podID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerID="eab1a972ced564d26d0363b491849a05a6f3e49a90ed55b28b5329a2b2fb593a" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.620827 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerDied","Data":"eab1a972ced564d26d0363b491849a05a6f3e49a90ed55b28b5329a2b2fb593a"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.626758 4856 generic.go:334] "Generic (PLEG): container finished" podID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerID="1851f20ad50d19aed32d3be103e4e1bc3e4b3415498ed43e1a10c91964f72276" exitCode=143 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.626834 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerDied","Data":"1851f20ad50d19aed32d3be103e4e1bc3e4b3415498ed43e1a10c91964f72276"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.630962 4856 generic.go:334] "Generic (PLEG): container finished" podID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerID="1b8689a9cd33b372d58dba21c6bfa7ae8b7939cff4da589d3f1a21739591dc55" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.630992 4856 generic.go:334] "Generic (PLEG): container finished" podID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerID="c0500e5534d244ad7df38fc881c26294037eedb93e6d08de11d7ab3a2446cade" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.631019 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" 
event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerDied","Data":"1b8689a9cd33b372d58dba21c6bfa7ae8b7939cff4da589d3f1a21739591dc55"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.631048 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerDied","Data":"c0500e5534d244ad7df38fc881c26294037eedb93e6d08de11d7ab3a2446cade"} Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.644052 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_822c90b2-be4b-4764-95fb-b0fb02a7a90a/ovsdbserver-nb/0.log" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.644213 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.647478 4856 scope.go:117] "RemoveContainer" containerID="31ce21f74e86925f967cb789712863c9c0a6a319a2909da14e6791c48d33e2e1" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.673430 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.679030 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.684171 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b1414886-740d-404d-997c-d10dcbfbfc06/ovsdbserver-sb/0.log" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.684236 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700204 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700547 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700585 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700631 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2f57\" (UniqueName: \"kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700659 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700696 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700780 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.700824 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\" (UID: \"822c90b2-be4b-4764-95fb-b0fb02a7a90a\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.702160 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c697k"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.702882 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.703365 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts" (OuterVolumeSpecName: "scripts") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.705688 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.712717 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config" (OuterVolumeSpecName: "config") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.713139 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57" (OuterVolumeSpecName: "kube-api-access-v2f57") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "kube-api-access-v2f57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.722178 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.733233 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.736486 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-4wp22"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.748164 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.755901 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-m72lx"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.759477 4856 scope.go:117] "RemoveContainer" containerID="13fbe8a6214cf53a695ee8776babc39be8ea87b89ebbab5b990a58da11d28f97" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.775428 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.785998 4856 scope.go:117] "RemoveContainer" containerID="d2421faa30776943955dd45b063c08f4fd3835932d949de6ba81f54b8fc10d07" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.799798 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802317 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802453 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802488 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802532 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqmw9\" (UniqueName: \"kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802616 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802656 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.802766 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc 
kubenswrapper[4856]: I0320 13:49:17.802801 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803221 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803243 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2f57\" (UniqueName: \"kubernetes.io/projected/822c90b2-be4b-4764-95fb-b0fb02a7a90a-kube-api-access-v2f57\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803254 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803288 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803307 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.803415 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822c90b2-be4b-4764-95fb-b0fb02a7a90a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.805950 4856 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.806191 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts" (OuterVolumeSpecName: "scripts") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.806610 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config" (OuterVolumeSpecName: "config") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.809362 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.820422 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.820493 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.823547 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.826190 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9" (OuterVolumeSpecName: "kube-api-access-pqmw9") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "kube-api-access-pqmw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.839016 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.854607 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "822c90b2-be4b-4764-95fb-b0fb02a7a90a" (UID: "822c90b2-be4b-4764-95fb-b0fb02a7a90a"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.859990 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119ab08d-caf7-497f-b11d-fb0d06a34600" path="/var/lib/kubelet/pods/119ab08d-caf7-497f-b11d-fb0d06a34600/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.860777 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131e451e-458e-4955-b559-b0eefb86cf25" path="/var/lib/kubelet/pods/131e451e-458e-4955-b559-b0eefb86cf25/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.861515 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f749db7-7069-4107-90ea-edfc4ea7dc7f" path="/var/lib/kubelet/pods/1f749db7-7069-4107-90ea-edfc4ea7dc7f/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.862634 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e64b873-3eb0-418d-9fe3-08d8cd4ea6de" path="/var/lib/kubelet/pods/2e64b873-3eb0-418d-9fe3-08d8cd4ea6de/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.863994 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac312f2-a405-42c5-980c-2791676ef7e0" path="/var/lib/kubelet/pods/5ac312f2-a405-42c5-980c-2791676ef7e0/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.864849 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" path="/var/lib/kubelet/pods/5e2c318a-4df7-4434-8f38-406da145ff89/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.868710 4856 scope.go:117] "RemoveContainer" containerID="5ab9a98d399750f4cde1f783ae91e1d160bff7a9ccceea461dbfd346e6e256f6" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.875339 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0" path="/var/lib/kubelet/pods/6179fed7-9fa1-44f7-9dfb-fb7fa99b83d0/volumes" Mar 20 13:49:17 
crc kubenswrapper[4856]: I0320 13:49:17.876542 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" path="/var/lib/kubelet/pods/636ad94a-b2ac-42c8-b83d-063d66cfeaf8/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.896300 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="667e1b8e-28bc-4227-8b0b-f0195587213f" path="/var/lib/kubelet/pods/667e1b8e-28bc-4227-8b0b-f0195587213f/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.902682 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c197353-35fa-478c-816b-c85320d3af70" path="/var/lib/kubelet/pods/6c197353-35fa-478c-816b-c85320d3af70/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.903531 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72c24034-7d59-49d7-b3e2-16d875f99bec" path="/var/lib/kubelet/pods/72c24034-7d59-49d7-b3e2-16d875f99bec/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.904303 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="885ead36-c4fe-42e5-8d15-95d1115cfcf4" path="/var/lib/kubelet/pods/885ead36-c4fe-42e5-8d15-95d1115cfcf4/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.911487 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94f62824-29e7-4d19-afd3-43ddb0867654" path="/var/lib/kubelet/pods/94f62824-29e7-4d19-afd3-43ddb0867654/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.912434 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.922935 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac73110f-ec10-4b3b-9a7c-02a43ce9cef4" path="/var/lib/kubelet/pods/ac73110f-ec10-4b3b-9a7c-02a43ce9cef4/volumes" Mar 20 13:49:17 crc 
kubenswrapper[4856]: I0320 13:49:17.938867 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.938956 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.938967 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e928ca03-85a4-4e75-bfc2-6752d35d34ab" path="/var/lib/kubelet/pods/e928ca03-85a4-4e75-bfc2-6752d35d34ab/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.940217 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.938979 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.940835 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") pod \"b1414886-740d-404d-997c-d10dcbfbfc06\" (UID: \"b1414886-740d-404d-997c-d10dcbfbfc06\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.940902 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.940933 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqr2d\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.940987 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.941063 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.941217 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.941324 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs\") pod \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\" (UID: \"65f0a0ca-c150-4773-a368-b3fb2dadfeb2\") " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942507 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942525 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942534 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942547 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942568 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942582 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1414886-740d-404d-997c-d10dcbfbfc06-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942591 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqmw9\" (UniqueName: \"kubernetes.io/projected/b1414886-740d-404d-997c-d10dcbfbfc06-kube-api-access-pqmw9\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.942600 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 
13:49:17.942609 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/822c90b2-be4b-4764-95fb-b0fb02a7a90a-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.944049 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: W0320 13:49:17.944984 4856 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/b1414886-740d-404d-997c-d10dcbfbfc06/volumes/kubernetes.io~secret/metrics-certs-tls-certs Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.945015 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.955085 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.957638 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.958985 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.959084 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="ovn-northd" Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.959320 4856 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 13:49:17 crc kubenswrapper[4856]: E0320 13:49:17.959454 4856 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts podName:bca1680a-2f52-465d-83e2-93fcbf318e19 nodeName:}" failed. No retries permitted until 2026-03-20 13:49:18.959392038 +0000 UTC m=+1573.840418168 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts") pod "root-account-create-update-tvqkz" (UID: "bca1680a-2f52-465d-83e2-93fcbf318e19") : configmap "openstack-cell1-scripts" not found Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.960693 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d" (OuterVolumeSpecName: "kube-api-access-qqr2d") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "kube-api-access-qqr2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.960802 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5f44803-700d-498a-819c-881f5959b477" path="/var/lib/kubelet/pods/f5f44803-700d-498a-819c-881f5959b477/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.962034 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.964243 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.966533 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7464638-d349-45b6-86af-1a5cd9f7664f" path="/var/lib/kubelet/pods/f7464638-d349-45b6-86af-1a5cd9f7664f/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.967924 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb213c60-487b-4248-bf86-ed69e2fac5e1" path="/var/lib/kubelet/pods/fb213c60-487b-4248-bf86-ed69e2fac5e1/volumes" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.969130 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b1414886-740d-404d-997c-d10dcbfbfc06" (UID: "b1414886-740d-404d-997c-d10dcbfbfc06"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.989463 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.989501 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:49:17 crc kubenswrapper[4856]: I0320 13:49:17.989516 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.004617 4856 scope.go:117] "RemoveContainer" containerID="de349a9fac18ad1354c9d674041dc5baa22d5b5162beba99ee8edb004fcba1ea" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.006311 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:18 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: if [ -n "nova_cell0" ]; then Mar 20 13:49:18 crc kubenswrapper[4856]: GRANT_DATABASE="nova_cell0" Mar 20 13:49:18 crc kubenswrapper[4856]: else Mar 20 13:49:18 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:18 crc kubenswrapper[4856]: fi Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:18 crc kubenswrapper[4856]: # 1. 
MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:18 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:18 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:18 crc kubenswrapper[4856]: # support updates Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.010346 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" podUID="c3a2b3f2-ab65-4013-9aa8-66d38474c2ab" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.011330 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.012513 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051690 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051719 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqr2d\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-kube-api-access-qqr2d\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051729 4856 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051738 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1414886-740d-404d-997c-d10dcbfbfc06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051746 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051755 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051764 4856 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.051858 4856 scope.go:117] "RemoveContainer" containerID="6fb4d0d72db1e2865ac852c989e6934a1228985a148c4cfeae18d2d130395973" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.062694 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.069837 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.074139 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.098302 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data" (OuterVolumeSpecName: "config-data") pod "65f0a0ca-c150-4773-a368-b3fb2dadfeb2" (UID: "65f0a0ca-c150-4773-a368-b3fb2dadfeb2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.153857 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfv56\" (UniqueName: \"kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56\") pod \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.154061 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts\") pod \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\" (UID: \"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.154396 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.154414 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.154423 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.154431 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/65f0a0ca-c150-4773-a368-b3fb2dadfeb2-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.155030 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6" (UID: "dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.162499 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56" (OuterVolumeSpecName: "kube-api-access-dfv56") pod "dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6" (UID: "dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6"). InnerVolumeSpecName "kube-api-access-dfv56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.256338 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.256364 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfv56\" (UniqueName: \"kubernetes.io/projected/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6-kube-api-access-dfv56\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.460168 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.564765 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data\") pod \"00a8314e-4faf-4926-82f2-35c25154a7b5\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.565195 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdcgr\" (UniqueName: \"kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr\") pod \"00a8314e-4faf-4926-82f2-35c25154a7b5\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.565231 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs\") pod \"00a8314e-4faf-4926-82f2-35c25154a7b5\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.565289 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle\") pod \"00a8314e-4faf-4926-82f2-35c25154a7b5\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.565411 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs\") pod \"00a8314e-4faf-4926-82f2-35c25154a7b5\" (UID: \"00a8314e-4faf-4926-82f2-35c25154a7b5\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.573344 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr" (OuterVolumeSpecName: "kube-api-access-rdcgr") pod "00a8314e-4faf-4926-82f2-35c25154a7b5" (UID: "00a8314e-4faf-4926-82f2-35c25154a7b5"). InnerVolumeSpecName "kube-api-access-rdcgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.612484 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.624483 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.624889 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00a8314e-4faf-4926-82f2-35c25154a7b5" (UID: "00a8314e-4faf-4926-82f2-35c25154a7b5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.633315 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.644871 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.644933 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.646372 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "00a8314e-4faf-4926-82f2-35c25154a7b5" (UID: "00a8314e-4faf-4926-82f2-35c25154a7b5"). InnerVolumeSpecName "nova-novncproxy-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.646458 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data" (OuterVolumeSpecName: "config-data") pod "00a8314e-4faf-4926-82f2-35c25154a7b5" (UID: "00a8314e-4faf-4926-82f2-35c25154a7b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.661824 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "00a8314e-4faf-4926-82f2-35c25154a7b5" (UID: "00a8314e-4faf-4926-82f2-35c25154a7b5"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.668709 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b1414886-740d-404d-997c-d10dcbfbfc06/ovsdbserver-sb/0.log" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.668821 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b1414886-740d-404d-997c-d10dcbfbfc06","Type":"ContainerDied","Data":"9d385442c55dd9888a0b9258eb36d00bf9d9ad75f41ebd2cb3de9c08afb3b344"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.668869 4856 scope.go:117] "RemoveContainer" containerID="97772f6ee946ca97d67b8f71fc7f1f0295d3e5e51873e5b35681e4f70ee23a04" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.668977 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.675469 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.675551 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdcgr\" (UniqueName: \"kubernetes.io/projected/00a8314e-4faf-4926-82f2-35c25154a7b5-kube-api-access-rdcgr\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.675589 4856 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.675623 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.675670 4856 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/00a8314e-4faf-4926-82f2-35c25154a7b5-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697376 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k8qzc"] Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697748 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="ovsdbserver-nb" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697761 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="ovsdbserver-nb" Mar 20 
13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697774 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-httpd" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697780 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-httpd" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697790 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697796 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697810 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697815 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697824 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" containerName="mysql-bootstrap" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697829 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" containerName="mysql-bootstrap" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697844 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697852 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" 
containerName="ovn-controller" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697863 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-server" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697870 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-server" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697882 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a8314e-4faf-4926-82f2-35c25154a7b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697889 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a8314e-4faf-4926-82f2-35c25154a7b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697897 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="rabbitmq" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697902 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="rabbitmq" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697914 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="ovsdbserver-sb" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697920 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="ovsdbserver-sb" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697928 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" containerName="galera" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697933 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" 
containerName="galera" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697944 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="setup-container" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697949 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="setup-container" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697963 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb213c60-487b-4248-bf86-ed69e2fac5e1" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697969 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb213c60-487b-4248-bf86-ed69e2fac5e1" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697979 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="dnsmasq-dns" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697985 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="dnsmasq-dns" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.697993 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="init" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.697999 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="init" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698148 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a8314e-4faf-4926-82f2-35c25154a7b5" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698161 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb213c60-487b-4248-bf86-ed69e2fac5e1" 
containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698173 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-server" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698182 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="636ad94a-b2ac-42c8-b83d-063d66cfeaf8" containerName="dnsmasq-dns" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698193 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="ovsdbserver-sb" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698202 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698210 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" containerName="proxy-httpd" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698221 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="openstack-network-exporter" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698229 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" containerName="ovsdbserver-nb" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698238 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" containerName="galera" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698247 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e2c318a-4df7-4434-8f38-406da145ff89" containerName="ovn-controller" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698255 4856 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" containerName="rabbitmq" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.698824 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.703710 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.704669 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k8qzc"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.714298 4856 generic.go:334] "Generic (PLEG): container finished" podID="00a8314e-4faf-4926-82f2-35c25154a7b5" containerID="54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef" exitCode=0 Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.714360 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a8314e-4faf-4926-82f2-35c25154a7b5","Type":"ContainerDied","Data":"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.714385 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"00a8314e-4faf-4926-82f2-35c25154a7b5","Type":"ContainerDied","Data":"48bab3d6f0381fcb45d9eec3b168847a0ffbdc483bed2994febfd468a2615d0d"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.714439 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.718755 4856 generic.go:334] "Generic (PLEG): container finished" podID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerID="d930765c8702d78cbe6f1e7514f35eb8ca4969feb1ad881999f4f96ab179ba9c" exitCode=143 Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.718809 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerDied","Data":"d930765c8702d78cbe6f1e7514f35eb8ca4969feb1ad881999f4f96ab179ba9c"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.723341 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" event={"ID":"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab","Type":"ContainerStarted","Data":"b97994d2abd832cfb53297d1b107906d1bad7e8b63a231f4113d01a9f7f174a9"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.731541 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.750533 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.766337 4856 scope.go:117] "RemoveContainer" containerID="e82da2e027c260fd3e464cbc4c2865412cd62f8b83fa8ce9156736fd50719665" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.766367 4856 generic.go:334] "Generic (PLEG): container finished" podID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerID="39d74692a61d64ac3b4733a0a061eddb4db4c4aede15d1da75d20e1e1dc827fa" exitCode=143 Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.766398 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" 
event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerDied","Data":"39d74692a61d64ac3b4733a0a061eddb4db4c4aede15d1da75d20e1e1dc827fa"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.768122 4856 generic.go:334] "Generic (PLEG): container finished" podID="1f98c320-f318-443d-816d-f3dec9784023" containerID="53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd" exitCode=143 Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.768184 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerDied","Data":"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.769462 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-844867ddc-kgprv" event={"ID":"65f0a0ca-c150-4773-a368-b3fb2dadfeb2","Type":"ContainerDied","Data":"d8fd0d9d1348d14868db817e4088cc9ef731c82b6e610757db5f4ebb231384c6"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.769556 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-844867ddc-kgprv" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.775069 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" event={"ID":"dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6","Type":"ContainerDied","Data":"9be6df78dfaf65080d7bdb1d6ebc0165ac757fbc03c10c637ba5c46722bf65e1"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.775377 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5b9c-account-create-update-cmvbr" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776320 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776409 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776452 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9pwm\" (UniqueName: \"kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776478 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776516 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776546 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776605 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.776633 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs\") pod \"64ade26b-4889-4021-b876-1fdbdb077c26\" (UID: \"64ade26b-4889-4021-b876-1fdbdb077c26\") " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.779712 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.779750 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.780188 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.781001 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.788287 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9268-account-create-update-lbntw" event={"ID":"ef92f699-7db4-4425-949a-693de8e803a3","Type":"ContainerStarted","Data":"e1dcd7ac1aae4eb5623dcf2b16f8699150411fb7570d8286a44f51dc145b06cc"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793360 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793365 4856 scope.go:117] "RemoveContainer" containerID="54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793466 4856 generic.go:334] "Generic (PLEG): container finished" podID="64ade26b-4889-4021-b876-1fdbdb077c26" containerID="08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c" exitCode=0 Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793477 4856 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "mysql-db") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793542 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793553 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793589 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerDied","Data":"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.793612 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"64ade26b-4889-4021-b876-1fdbdb077c26","Type":"ContainerDied","Data":"85809beb348a1e815d58f20f7a46b185caf09efd3526b86a930c0cb608df5be1"} Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.794037 4856 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/root-account-create-update-tvqkz" secret="" err="secret \"galera-openstack-cell1-dockercfg-dr8n4\" not found" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.801384 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:18 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: if [ -n "" ]; then Mar 20 13:49:18 crc kubenswrapper[4856]: GRANT_DATABASE="" Mar 20 13:49:18 crc kubenswrapper[4856]: else Mar 20 13:49:18 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:18 crc kubenswrapper[4856]: fi Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:18 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:18 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:18 crc kubenswrapper[4856]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:18 crc kubenswrapper[4856]: # support updates Mar 20 13:49:18 crc kubenswrapper[4856]: Mar 20 13:49:18 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.802650 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-tvqkz" podUID="bca1680a-2f52-465d-83e2-93fcbf318e19" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.813816 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.817450 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm" (OuterVolumeSpecName: "kube-api-access-c9pwm") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "kube-api-access-c9pwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.866042 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884502 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kccnr\" (UniqueName: \"kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884608 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884728 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884742 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9pwm\" (UniqueName: \"kubernetes.io/projected/64ade26b-4889-4021-b876-1fdbdb077c26-kube-api-access-c9pwm\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884773 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884783 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-config-data-default\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884793 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884802 4856 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/64ade26b-4889-4021-b876-1fdbdb077c26-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.884819 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.891632 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "64ade26b-4889-4021-b876-1fdbdb077c26" (UID: "64ade26b-4889-4021-b876-1fdbdb077c26"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.897350 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.910680 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.915118 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.921324 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.961797 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-844867ddc-kgprv"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.971829 4856 scope.go:117] "RemoveContainer" containerID="54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.973864 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef\": container with ID starting with 54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef not found: ID does not exist" containerID="54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.973909 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef"} err="failed to get container status \"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef\": rpc error: code = NotFound desc = could not find container 
\"54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef\": container with ID starting with 54796f0a0fae65773424077f71390a18ce72805ca1f683e5d0d406e4eb9c86ef not found: ID does not exist" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.973962 4856 scope.go:117] "RemoveContainer" containerID="1b8689a9cd33b372d58dba21c6bfa7ae8b7939cff4da589d3f1a21739591dc55" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.982521 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.986751 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.986945 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kccnr\" (UniqueName: \"kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.987005 4856 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/64ade26b-4889-4021-b876-1fdbdb077c26-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.987015 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.987817 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-5b9c-account-create-update-cmvbr"] Mar 20 13:49:18 crc kubenswrapper[4856]: I0320 13:49:18.987885 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.988766 4856 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 20 13:49:18 crc kubenswrapper[4856]: E0320 13:49:18.988815 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts podName:bca1680a-2f52-465d-83e2-93fcbf318e19 nodeName:}" failed. No retries permitted until 2026-03-20 13:49:20.988801189 +0000 UTC m=+1575.869827309 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts") pod "root-account-create-update-tvqkz" (UID: "bca1680a-2f52-465d-83e2-93fcbf318e19") : configmap "openstack-cell1-scripts" not found Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.003091 4856 scope.go:117] "RemoveContainer" containerID="c0500e5534d244ad7df38fc881c26294037eedb93e6d08de11d7ab3a2446cade" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.008129 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kccnr\" (UniqueName: \"kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr\") pod \"root-account-create-update-k8qzc\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.025789 4856 scope.go:117] "RemoveContainer" containerID="08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.031824 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.043709 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.061027 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.171:8776/healthcheck\": read tcp 10.217.0.2:52492->10.217.0.171:8776: read: connection reset by peer" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.096237 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.096677 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-central-agent" containerID="cri-o://1921c9edd328e3c015ebb3e3c66b2a013cbe2bfbc222ca92d31d2027cab3c79d" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.097369 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="sg-core" containerID="cri-o://8baa4310240d002ce370f1938841d57f264431b93149fa8dcd6abcd5b71b8287" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.097454 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-notification-agent" containerID="cri-o://776beaea3c9377bde7e78faa04866103da0c3428a4e7fd231cab6e96b2a77971" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.097666 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="proxy-httpd" containerID="cri-o://88dd600e0e2eead445603b77f3af33d71b91561cfdbeb4eb78f38a3d301378d1" 
gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.178133 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.178799 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" containerName="kube-state-metrics" containerID="cri-o://b6968ce7333be8a128c62fc951855092f3670f6f1c40b19c0d3cebdfeb3e52d2" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.190176 4856 scope.go:117] "RemoveContainer" containerID="067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.191859 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts\") pod \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.191907 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz2tp\" (UniqueName: \"kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp\") pod \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\" (UID: \"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.193286 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3a2b3f2-ab65-4013-9aa8-66d38474c2ab" (UID: "c3a2b3f2-ab65-4013-9aa8-66d38474c2ab"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.198367 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp" (OuterVolumeSpecName: "kube-api-access-qz2tp") pod "c3a2b3f2-ab65-4013-9aa8-66d38474c2ab" (UID: "c3a2b3f2-ab65-4013-9aa8-66d38474c2ab"). InnerVolumeSpecName "kube-api-access-qz2tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.294745 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.294789 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz2tp\" (UniqueName: \"kubernetes.io/projected/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab-kube-api-access-qz2tp\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.306526 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.306915 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="628ad6bb-ab51-4021-9757-4247a1ccfa71" containerName="memcached" containerID="cri-o://064628587a2ade55cadbdf45c06311c4f3381732bc6b09b5dbd0ce453ed18ef9" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.342245 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4b4e-account-create-update-fk89g"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.364152 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4b4e-account-create-update-fk89g"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.419175 4856 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-4b4e-account-create-update-b95jl"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.428740 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.431118 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.438172 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-4b4e-account-create-update-b95jl"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.451329 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-97mlx"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.460455 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-97mlx"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.461528 4856 scope.go:117] "RemoveContainer" containerID="08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c" Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.462423 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c\": container with ID starting with 08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c not found: ID does not exist" containerID="08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.462457 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c"} err="failed to get container status \"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c\": rpc error: code = NotFound desc = could not find container 
\"08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c\": container with ID starting with 08b403c08deb3d80ed06d1238b9c868569a28008c1c496a961ef88f6de7ed32c not found: ID does not exist" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.462479 4856 scope.go:117] "RemoveContainer" containerID="067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0" Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.462673 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0\": container with ID starting with 067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0 not found: ID does not exist" containerID="067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.462693 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0"} err="failed to get container status \"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0\": rpc error: code = NotFound desc = could not find container \"067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0\": container with ID starting with 067e177871a02e79c470eb45523c1a98c24f20e9cedf62bc1b91b10b3dbcd9a0 not found: ID does not exist" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.472839 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ck59h"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.481194 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ck59h"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.487451 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.487701 4856 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-7c6b6b7976-vc6rm" podUID="1ac0adc6-d09a-4367-838e-67f78ae5a050" containerName="keystone-api" containerID="cri-o://4eb1f6a354f4bcf7cd6a7759bd2b31a120d7b8a455f854f1604a17e887048c77" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.503791 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.509183 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-rv7l2"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.524720 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-rv7l2"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.576109 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4b4e-account-create-update-b95jl"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.595999 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k8qzc"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.604309 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.607046 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.607124 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjn9p\" (UniqueName: \"kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p\") pod 
\"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.619077 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.635413 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-sjn9p operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-4b4e-account-create-update-b95jl" podUID="bd4e74c5-7795-4fcd-ba5f-11cd6a05575d" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.656391 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.719082 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.719137 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjn9p\" (UniqueName: \"kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.719522 4856 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.719569 4856 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. No retries permitted until 2026-03-20 13:49:20.219553277 +0000 UTC m=+1575.100579417 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : configmap "openstack-scripts" not found Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.724773 4856 projected.go:194] Error preparing data for projected volume kube-api-access-sjn9p for pod openstack/keystone-4b4e-account-create-update-b95jl: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 13:49:19 crc kubenswrapper[4856]: E0320 13:49:19.724835 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. No retries permitted until 2026-03-20 13:49:20.224816227 +0000 UTC m=+1575.105842357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sjn9p" (UniqueName: "kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.800572 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.806666 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="galera" containerID="cri-o://bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092" gracePeriod=30 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.811568 4856 generic.go:334] "Generic (PLEG): container finished" podID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" containerID="b6968ce7333be8a128c62fc951855092f3670f6f1c40b19c0d3cebdfeb3e52d2" exitCode=2 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.811622 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"82ca80b6-bf8a-4741-a5e0-059f20fae69b","Type":"ContainerDied","Data":"b6968ce7333be8a128c62fc951855092f3670f6f1c40b19c0d3cebdfeb3e52d2"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.811684 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.815113 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-e816-account-create-update-84wt2" event={"ID":"d744a3b5-7023-416f-85cf-62400a452558","Type":"ContainerDied","Data":"d12cd78e36f925abdae6e61b569af47f675a1515828b07a75fc751de0bf429db"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.815426 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-e816-account-create-update-84wt2" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.820852 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts\") pod \"ef92f699-7db4-4425-949a-693de8e803a3\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.820933 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv88t\" (UniqueName: \"kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t\") pod \"ef92f699-7db4-4425-949a-693de8e803a3\" (UID: \"ef92f699-7db4-4425-949a-693de8e803a3\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.827939 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t" (OuterVolumeSpecName: "kube-api-access-bv88t") pod "ef92f699-7db4-4425-949a-693de8e803a3" (UID: "ef92f699-7db4-4425-949a-693de8e803a3"). InnerVolumeSpecName "kube-api-access-bv88t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.828127 4856 generic.go:334] "Generic (PLEG): container finished" podID="9010796b-5362-4885-8a2c-19668efe6e25" containerID="5bb29030c4a50eae6ca1db03ef400e510392fd28217af8dc0f5c5c0444dfd46e" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.830690 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00641aa4-4bea-4510-b7a7-a5e1c022340a" path="/var/lib/kubelet/pods/00641aa4-4bea-4510-b7a7-a5e1c022340a/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.831159 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a8314e-4faf-4926-82f2-35c25154a7b5" path="/var/lib/kubelet/pods/00a8314e-4faf-4926-82f2-35c25154a7b5/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.832167 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a5438ec-0454-4d8e-b356-f9b87b66c2d7" path="/var/lib/kubelet/pods/0a5438ec-0454-4d8e-b356-f9b87b66c2d7/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.833118 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef92f699-7db4-4425-949a-693de8e803a3" (UID: "ef92f699-7db4-4425-949a-693de8e803a3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.833634 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="623925cd-b615-49a0-be71-08bf412fdc92" path="/var/lib/kubelet/pods/623925cd-b615-49a0-be71-08bf412fdc92/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.834216 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ade26b-4889-4021-b876-1fdbdb077c26" path="/var/lib/kubelet/pods/64ade26b-4889-4021-b876-1fdbdb077c26/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.834795 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f0a0ca-c150-4773-a368-b3fb2dadfeb2" path="/var/lib/kubelet/pods/65f0a0ca-c150-4773-a368-b3fb2dadfeb2/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.835828 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822c90b2-be4b-4764-95fb-b0fb02a7a90a" path="/var/lib/kubelet/pods/822c90b2-be4b-4764-95fb-b0fb02a7a90a/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.836548 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1414886-740d-404d-997c-d10dcbfbfc06" path="/var/lib/kubelet/pods/b1414886-740d-404d-997c-d10dcbfbfc06/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.837925 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b203ab60-5ec5-4897-87bb-915b96c106ed" path="/var/lib/kubelet/pods/b203ab60-5ec5-4897-87bb-915b96c106ed/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.838426 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6" path="/var/lib/kubelet/pods/dbf696bc-ec01-4d4d-a1c7-b3b6a5d262a6/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.838760 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64bb4e5-da79-4c81-b430-1d65747c1d37" 
path="/var/lib/kubelet/pods/f64bb4e5-da79-4c81-b430-1d65747c1d37/volumes" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.853126 4856 generic.go:334] "Generic (PLEG): container finished" podID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerID="2e30bbf7e5c9ce212c4664a13de9d24567d97e9650bc51aebc00393f15f7368c" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.853252 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.885383 4856 generic.go:334] "Generic (PLEG): container finished" podID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerID="89e762b7150be959d006f5453a95ec72bc93fd09878cdcfd7e75bf3a0eb48e5a" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.897879 4856 generic.go:334] "Generic (PLEG): container finished" podID="4a53fecc-3af1-4ced-acd9-198296d50771" containerID="609ee66b8ffd12f755f76cd9565ffad47cf29b376b4955ce6c3cee2c6b3c9d01" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.905433 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.910020 4856 generic.go:334] "Generic (PLEG): container finished" podID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerID="88dd600e0e2eead445603b77f3af33d71b91561cfdbeb4eb78f38a3d301378d1" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.910063 4856 generic.go:334] "Generic (PLEG): container finished" podID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerID="8baa4310240d002ce370f1938841d57f264431b93149fa8dcd6abcd5b71b8287" exitCode=2 Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.923485 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz2hj\" (UniqueName: \"kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj\") pod \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\" (UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.923554 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8hhp\" (UniqueName: \"kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp\") pod \"d744a3b5-7023-416f-85cf-62400a452558\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.923602 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts\") pod \"d744a3b5-7023-416f-85cf-62400a452558\" (UID: \"d744a3b5-7023-416f-85cf-62400a452558\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.923662 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts\") pod \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\" 
(UID: \"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72\") " Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.924058 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv88t\" (UniqueName: \"kubernetes.io/projected/ef92f699-7db4-4425-949a-693de8e803a3-kube-api-access-bv88t\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.924069 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef92f699-7db4-4425-949a-693de8e803a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.925249 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d744a3b5-7023-416f-85cf-62400a452558" (UID: "d744a3b5-7023-416f-85cf-62400a452558"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.930108 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp" (OuterVolumeSpecName: "kube-api-access-m8hhp") pod "d744a3b5-7023-416f-85cf-62400a452558" (UID: "d744a3b5-7023-416f-85cf-62400a452558"). InnerVolumeSpecName "kube-api-access-m8hhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.930331 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6eaf0274-e5e2-4ece-868f-2e7ef81d0e72" (UID: "6eaf0274-e5e2-4ece-868f-2e7ef81d0e72"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.939259 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj" (OuterVolumeSpecName: "kube-api-access-cz2hj") pod "6eaf0274-e5e2-4ece-868f-2e7ef81d0e72" (UID: "6eaf0274-e5e2-4ece-868f-2e7ef81d0e72"). InnerVolumeSpecName "kube-api-access-cz2hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.939680 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.941318 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955564 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerDied","Data":"5bb29030c4a50eae6ca1db03ef400e510392fd28217af8dc0f5c5c0444dfd46e"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955603 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerDied","Data":"2e30bbf7e5c9ce212c4664a13de9d24567d97e9650bc51aebc00393f15f7368c"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955617 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerDied","Data":"89e762b7150be959d006f5453a95ec72bc93fd09878cdcfd7e75bf3a0eb48e5a"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955636 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerDied","Data":"609ee66b8ffd12f755f76cd9565ffad47cf29b376b4955ce6c3cee2c6b3c9d01"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955647 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9268-account-create-update-lbntw" event={"ID":"ef92f699-7db4-4425-949a-693de8e803a3","Type":"ContainerDied","Data":"e1dcd7ac1aae4eb5623dcf2b16f8699150411fb7570d8286a44f51dc145b06cc"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955657 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerDied","Data":"88dd600e0e2eead445603b77f3af33d71b91561cfdbeb4eb78f38a3d301378d1"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955667 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerDied","Data":"8baa4310240d002ce370f1938841d57f264431b93149fa8dcd6abcd5b71b8287"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.955680 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db61-account-create-update-qs8jw" event={"ID":"c3a2b3f2-ab65-4013-9aa8-66d38474c2ab","Type":"ContainerDied","Data":"b97994d2abd832cfb53297d1b107906d1bad7e8b63a231f4113d01a9f7f174a9"} Mar 20 13:49:19 crc kubenswrapper[4856]: I0320 13:49:19.971026 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:19.992293 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.021461 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.029480 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts\") pod \"83848803-038e-4b5a-b161-9f20a629ae9a\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.029571 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mc7m\" (UniqueName: \"kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m\") pod \"83848803-038e-4b5a-b161-9f20a629ae9a\" (UID: \"83848803-038e-4b5a-b161-9f20a629ae9a\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.030027 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8hhp\" (UniqueName: \"kubernetes.io/projected/d744a3b5-7023-416f-85cf-62400a452558-kube-api-access-m8hhp\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.030043 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d744a3b5-7023-416f-85cf-62400a452558-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.030053 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.030061 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz2hj\" (UniqueName: \"kubernetes.io/projected/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72-kube-api-access-cz2hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.030465 4856 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83848803-038e-4b5a-b161-9f20a629ae9a" (UID: "83848803-038e-4b5a-b161-9f20a629ae9a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.042077 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m" (OuterVolumeSpecName: "kube-api-access-7mc7m") pod "83848803-038e-4b5a-b161-9f20a629ae9a" (UID: "83848803-038e-4b5a-b161-9f20a629ae9a"). InnerVolumeSpecName "kube-api-access-7mc7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.102318 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.110034 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db61-account-create-update-qs8jw"] Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.130830 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs\") pod \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.130895 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.130920 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.130942 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.130984 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131029 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2k7\" (UniqueName: \"kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7\") pod \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131053 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckhtx\" (UniqueName: \"kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131109 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131151 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131206 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle\") pod \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131227 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131244 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id\") pod \"4a53fecc-3af1-4ced-acd9-198296d50771\" (UID: \"4a53fecc-3af1-4ced-acd9-198296d50771\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.131365 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config\") pod \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\" (UID: \"82ca80b6-bf8a-4741-a5e0-059f20fae69b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 
13:49:20.132349 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.133100 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs" (OuterVolumeSpecName: "logs") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.133510 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83848803-038e-4b5a-b161-9f20a629ae9a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.133584 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mc7m\" (UniqueName: \"kubernetes.io/projected/83848803-038e-4b5a-b161-9f20a629ae9a-kube-api-access-7mc7m\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.133643 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a53fecc-3af1-4ced-acd9-198296d50771-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.133702 4856 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a53fecc-3af1-4ced-acd9-198296d50771-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.139128 4856 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.156843 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.171097 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.181961 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-e816-account-create-update-84wt2"] Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.186949 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts" (OuterVolumeSpecName: "scripts") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.186998 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7" (OuterVolumeSpecName: "kube-api-access-bf2k7") pod "82ca80b6-bf8a-4741-a5e0-059f20fae69b" (UID: "82ca80b6-bf8a-4741-a5e0-059f20fae69b"). InnerVolumeSpecName "kube-api-access-bf2k7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.187100 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx" (OuterVolumeSpecName: "kube-api-access-ckhtx") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "kube-api-access-ckhtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.195655 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82ca80b6-bf8a-4741-a5e0-059f20fae69b" (UID: "82ca80b6-bf8a-4741-a5e0-059f20fae69b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.218554 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "82ca80b6-bf8a-4741-a5e0-059f20fae69b" (UID: "82ca80b6-bf8a-4741-a5e0-059f20fae69b"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.222733 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235355 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235406 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjn9p\" (UniqueName: \"kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.235481 4856 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235528 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235541 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.235557 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. 
No retries permitted until 2026-03-20 13:49:21.235539804 +0000 UTC m=+1576.116565934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : configmap "openstack-scripts" not found Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235587 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2k7\" (UniqueName: \"kubernetes.io/projected/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-api-access-bf2k7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235600 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckhtx\" (UniqueName: \"kubernetes.io/projected/4a53fecc-3af1-4ced-acd9-198296d50771-kube-api-access-ckhtx\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235613 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235622 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.235635 4856 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.241402 4856 projected.go:194] Error preparing data for projected volume kube-api-access-sjn9p for pod openstack/keystone-4b4e-account-create-update-b95jl: 
failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.241473 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. No retries permitted until 2026-03-20 13:49:21.241455361 +0000 UTC m=+1576.122481491 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjn9p" (UniqueName: "kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.262178 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "82ca80b6-bf8a-4741-a5e0-059f20fae69b" (UID: "82ca80b6-bf8a-4741-a5e0-059f20fae69b"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.276424 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.340382 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.340898 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgcm4\" (UniqueName: \"kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.340953 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341010 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341072 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341211 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341249 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341329 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs\") pod \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\" (UID: \"a70b9b91-b663-40a8-a2a8-f1f57fc17bab\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341749 4856 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/82ca80b6-bf8a-4741-a5e0-059f20fae69b-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341769 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.341780 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.343799 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs" (OuterVolumeSpecName: "logs") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.354826 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts" (OuterVolumeSpecName: "scripts") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.354837 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data" (OuterVolumeSpecName: "config-data") pod "4a53fecc-3af1-4ced-acd9-198296d50771" (UID: "4a53fecc-3af1-4ced-acd9-198296d50771"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.358692 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4" (OuterVolumeSpecName: "kube-api-access-jgcm4") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "kube-api-access-jgcm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.410037 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data" (OuterVolumeSpecName: "config-data") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.449567 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgcm4\" (UniqueName: \"kubernetes.io/projected/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-kube-api-access-jgcm4\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.449633 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.449649 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.449663 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a53fecc-3af1-4ced-acd9-198296d50771-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.449679 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.462103 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.504799 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.524259 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.526703 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.526735 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.531853 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.531941 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.531979 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.533926 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.534005 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:49:20 crc kubenswrapper[4856]: 
I0320 13:49:20.546554 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a70b9b91-b663-40a8-a2a8-f1f57fc17bab" (UID: "a70b9b91-b663-40a8-a2a8-f1f57fc17bab"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.551693 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.551725 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.551739 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a70b9b91-b663-40a8-a2a8-f1f57fc17bab-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.555277 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6c47c6db4b-7s8m7" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34996->10.217.0.166:9311: read: connection reset by peer" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.555539 4856 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-6c47c6db4b-7s8m7" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.166:9311/healthcheck\": read tcp 10.217.0.2:34998->10.217.0.166:9311: read: 
connection reset by peer" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.591991 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.629158 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.638988 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.702973 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.720302 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k8qzc"] Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753774 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753831 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753883 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc 
kubenswrapper[4856]: I0320 13:49:20.753924 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753944 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753974 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6skj\" (UniqueName: \"kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.753995 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d4cf\" (UniqueName: \"kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf\") pod \"bca1680a-2f52-465d-83e2-93fcbf318e19\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754017 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754042 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754069 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754099 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754115 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgr56\" (UniqueName: \"kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754151 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754169 4856 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754188 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run\") pod \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\" (UID: \"b65360e6-90a7-4a8e-8647-6239e7c52e5b\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754228 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts\") pod \"bca1680a-2f52-465d-83e2-93fcbf318e19\" (UID: \"bca1680a-2f52-465d-83e2-93fcbf318e19\") " Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.754246 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle\") pod \"9010796b-5362-4885-8a2c-19668efe6e25\" (UID: \"9010796b-5362-4885-8a2c-19668efe6e25\") " Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.754533 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:49:20 crc kubenswrapper[4856]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[/bin/sh -c #!/bin/bash Mar 20 13:49:20 crc kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 20 13:49:20 crc kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 20 13:49:20 crc 
kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 20 13:49:20 crc kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: if [ -n "" ]; then Mar 20 13:49:20 crc kubenswrapper[4856]: GRANT_DATABASE="" Mar 20 13:49:20 crc kubenswrapper[4856]: else Mar 20 13:49:20 crc kubenswrapper[4856]: GRANT_DATABASE="*" Mar 20 13:49:20 crc kubenswrapper[4856]: fi Mar 20 13:49:20 crc kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: # going for maximum compatibility here: Mar 20 13:49:20 crc kubenswrapper[4856]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 20 13:49:20 crc kubenswrapper[4856]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 20 13:49:20 crc kubenswrapper[4856]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 20 13:49:20 crc kubenswrapper[4856]: # support updates Mar 20 13:49:20 crc kubenswrapper[4856]: Mar 20 13:49:20 crc kubenswrapper[4856]: $MYSQL_CMD < logger="UnhandledError" Mar 20 13:49:20 crc kubenswrapper[4856]: E0320 13:49:20.755869 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-k8qzc" podUID="925c991c-480f-4c77-a47e-d669aaa6d3dd" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.756313 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.758103 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bca1680a-2f52-465d-83e2-93fcbf318e19" (UID: "bca1680a-2f52-465d-83e2-93fcbf318e19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.760460 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts" (OuterVolumeSpecName: "scripts") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.761758 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56" (OuterVolumeSpecName: "kube-api-access-qgr56") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "kube-api-access-qgr56". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.762012 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.762255 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs" (OuterVolumeSpecName: "logs") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.763292 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs" (OuterVolumeSpecName: "logs") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4856]: I0320 13:49:20.763844 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.766252 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts" (OuterVolumeSpecName: "scripts") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.767478 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.769434 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf" (OuterVolumeSpecName: "kube-api-access-7d4cf") pod "bca1680a-2f52-465d-83e2-93fcbf318e19" (UID: "bca1680a-2f52-465d-83e2-93fcbf318e19"). InnerVolumeSpecName "kube-api-access-7d4cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.778516 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj" (OuterVolumeSpecName: "kube-api-access-c6skj") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "kube-api-access-c6skj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.807509 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.815564 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.855966 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpzfp\" (UniqueName: \"kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp\") pod \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856037 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data\") pod \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856088 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs\") pod \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856133 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle\") pod \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856166 
4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs\") pod \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\" (UID: \"467cf6ce-9c87-45d6-9968-4d5372f70cb3\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856664 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856681 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856700 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856710 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6skj\" (UniqueName: \"kubernetes.io/projected/b65360e6-90a7-4a8e-8647-6239e7c52e5b-kube-api-access-c6skj\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856721 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d4cf\" (UniqueName: \"kubernetes.io/projected/bca1680a-2f52-465d-83e2-93fcbf318e19-kube-api-access-7d4cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856730 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856738 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856748 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9010796b-5362-4885-8a2c-19668efe6e25-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856760 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856770 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgr56\" (UniqueName: \"kubernetes.io/projected/9010796b-5362-4885-8a2c-19668efe6e25-kube-api-access-qgr56\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856778 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856787 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856796 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b65360e6-90a7-4a8e-8647-6239e7c52e5b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.856807 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bca1680a-2f52-465d-83e2-93fcbf318e19-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: 
I0320 13:49:20.858185 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs" (OuterVolumeSpecName: "logs") pod "467cf6ce-9c87-45d6-9968-4d5372f70cb3" (UID: "467cf6ce-9c87-45d6-9968-4d5372f70cb3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.864566 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp" (OuterVolumeSpecName: "kube-api-access-hpzfp") pod "467cf6ce-9c87-45d6-9968-4d5372f70cb3" (UID: "467cf6ce-9c87-45d6-9968-4d5372f70cb3"). InnerVolumeSpecName "kube-api-access-hpzfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.866425 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.870519 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data" (OuterVolumeSpecName: "config-data") pod "b65360e6-90a7-4a8e-8647-6239e7c52e5b" (UID: "b65360e6-90a7-4a8e-8647-6239e7c52e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.884254 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.888744 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.889525 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.894100 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data" (OuterVolumeSpecName: "config-data") pod "467cf6ce-9c87-45d6-9968-4d5372f70cb3" (UID: "467cf6ce-9c87-45d6-9968-4d5372f70cb3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.897727 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data" (OuterVolumeSpecName: "config-data") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.929339 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9010796b-5362-4885-8a2c-19668efe6e25" (UID: "9010796b-5362-4885-8a2c-19668efe6e25"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.942143 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "467cf6ce-9c87-45d6-9968-4d5372f70cb3" (UID: "467cf6ce-9c87-45d6-9968-4d5372f70cb3"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.956603 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6441-account-create-update-bl9lv" event={"ID":"6eaf0274-e5e2-4ece-868f-2e7ef81d0e72","Type":"ContainerDied","Data":"5af38e17ef595e03b168b32814e4ef3a88f3cbda06209ce1471a404be0c6697f"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.956701 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6441-account-create-update-bl9lv" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958744 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958787 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958797 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958807 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpzfp\" (UniqueName: 
\"kubernetes.io/projected/467cf6ce-9c87-45d6-9968-4d5372f70cb3-kube-api-access-hpzfp\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958817 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9010796b-5362-4885-8a2c-19668efe6e25-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958824 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958834 4856 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958867 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958878 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b65360e6-90a7-4a8e-8647-6239e7c52e5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.958885 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/467cf6ce-9c87-45d6-9968-4d5372f70cb3-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.961609 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "467cf6ce-9c87-45d6-9968-4d5372f70cb3" (UID: 
"467cf6ce-9c87-45d6-9968-4d5372f70cb3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.962313 4856 generic.go:334] "Generic (PLEG): container finished" podID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerID="bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3" exitCode=0 Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.962394 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerDied","Data":"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.962425 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"467cf6ce-9c87-45d6-9968-4d5372f70cb3","Type":"ContainerDied","Data":"6b2379550dbcddc223c0fb22f39d2469bc815db27958782addc2524e5370a8ad"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.962443 4856 scope.go:117] "RemoveContainer" containerID="bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.962572 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.970782 4856 generic.go:334] "Generic (PLEG): container finished" podID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerID="1921c9edd328e3c015ebb3e3c66b2a013cbe2bfbc222ca92d31d2027cab3c79d" exitCode=0 Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.970851 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerDied","Data":"1921c9edd328e3c015ebb3e3c66b2a013cbe2bfbc222ca92d31d2027cab3c79d"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.973384 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b65360e6-90a7-4a8e-8647-6239e7c52e5b","Type":"ContainerDied","Data":"3be30f93f31705feb401b66300fbc86df2ab10e4f07b8260acddf826bc38ec74"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.973464 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.975498 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k8qzc" event={"ID":"925c991c-480f-4c77-a47e-d669aaa6d3dd","Type":"ContainerStarted","Data":"bdcc738b3a4177d25654fbf1c9d6e5ffa464aa212455dc311992d4287ce2f83c"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.978092 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9010796b-5362-4885-8a2c-19668efe6e25","Type":"ContainerDied","Data":"4875c22fe0a8b278ce840b2edb9f1d4e9e6d4ba220a514f94ed92626dd49247a"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.978143 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.989793 4856 generic.go:334] "Generic (PLEG): container finished" podID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerID="a9aa0ace58f4d55b503d781dd43d4983ad8cbc31b36675ba56fa2bddfba479db" exitCode=0 Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.989854 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerDied","Data":"a9aa0ace58f4d55b503d781dd43d4983ad8cbc31b36675ba56fa2bddfba479db"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.995318 4856 generic.go:334] "Generic (PLEG): container finished" podID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerID="c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e" exitCode=0 Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.995405 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.995433 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerDied","Data":"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.995463 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dd2bd8e2-7f52-4c35-ac1d-f1175581a751","Type":"ContainerDied","Data":"15b649f7b1ee27f41c296ceed5d90c6f5d98f67d5a3da929931045f2a11a0752"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.997403 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54db87fb-r8q4w" event={"ID":"a70b9b91-b663-40a8-a2a8-f1f57fc17bab","Type":"ContainerDied","Data":"e62cc3f6af91da5a8582e75d8459868ec9f682889b5dd798e4b76298a0ae5600"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:20.997509 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54db87fb-r8q4w" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.001083 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.002882 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4a53fecc-3af1-4ced-acd9-198296d50771","Type":"ContainerDied","Data":"4951677a866d1ded7fb56ec01c6b45fae7a6b5bb13d0240b9af5a5755413fb4f"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.012167 4856 scope.go:117] "RemoveContainer" containerID="70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.014507 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"82ca80b6-bf8a-4741-a5e0-059f20fae69b","Type":"ContainerDied","Data":"04bc4b2c7647c23ac5a613af742f84aa6b715cccf702fa9771892e53d2a1c079"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.014657 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.017014 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23fc74c5-121e-4ac1-8d50-8be3393d080a/ovn-northd/0.log" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.017051 4856 generic.go:334] "Generic (PLEG): container finished" podID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerID="9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69" exitCode=139 Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.017105 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerDied","Data":"9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.027482 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 
13:49:21.028318 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-812f-account-create-update-4crpm" event={"ID":"83848803-038e-4b5a-b161-9f20a629ae9a","Type":"ContainerDied","Data":"f3a4edae93642c1b8ab80ce0a7dbbb635d1751a55b5e9f307d5cfb54976e7efb"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.028678 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-812f-account-create-update-4crpm" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.031127 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.031284 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-tvqkz" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.031370 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-tvqkz" event={"ID":"bca1680a-2f52-465d-83e2-93fcbf318e19","Type":"ContainerDied","Data":"f1fe12659f7f3ba5a58b2518fb6116e39e5600c51db6e53425afeb9e8269f917"} Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.038766 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6441-account-create-update-bl9lv"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.046830 4856 scope.go:117] "RemoveContainer" containerID="bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3" Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.047323 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3\": container with ID starting with bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3 not found: ID does not exist" 
containerID="bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.047360 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3"} err="failed to get container status \"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3\": rpc error: code = NotFound desc = could not find container \"bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3\": container with ID starting with bc15556b44b7e7cf081e344c91774eb2094052c2c8d2d4bc3595fcf7d5ecc0d3 not found: ID does not exist" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.047386 4856 scope.go:117] "RemoveContainer" containerID="70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4" Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.047870 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4\": container with ID starting with 70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4 not found: ID does not exist" containerID="70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.047897 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4"} err="failed to get container status \"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4\": rpc error: code = NotFound desc = could not find container \"70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4\": container with ID starting with 70f368f30b4b7c0df1f15e77b7609b001ab15002221793913f183d5819c597a4 not found: ID does not exist" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.047914 4856 scope.go:117] 
"RemoveContainer" containerID="2e30bbf7e5c9ce212c4664a13de9d24567d97e9650bc51aebc00393f15f7368c" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.059819 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.059863 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.059908 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.059956 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.060078 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.060133 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.060175 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.060943 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dj6b\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.060985 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.061034 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.061063 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf\") pod \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\" (UID: \"dd2bd8e2-7f52-4c35-ac1d-f1175581a751\") " Mar 20 
13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.061725 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.062147 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.062446 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/467cf6ce-9c87-45d6-9968-4d5372f70cb3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.062952 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.065215 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info" (OuterVolumeSpecName: "pod-info") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.076817 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.079454 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.085845 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.086872 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.100127 4856 scope.go:117] "RemoveContainer" containerID="eab1a972ced564d26d0363b491849a05a6f3e49a90ed55b28b5329a2b2fb593a" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.104765 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b" (OuterVolumeSpecName: "kube-api-access-8dj6b") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "kube-api-access-8dj6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.109527 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.112371 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data" (OuterVolumeSpecName: "config-data") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.122498 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.126755 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf" (OuterVolumeSpecName: "server-conf") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.127173 4856 scope.go:117] "RemoveContainer" containerID="5bb29030c4a50eae6ca1db03ef400e510392fd28217af8dc0f5c5c0444dfd46e" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.129030 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.142117 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.151919 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.160262 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.167941 4856 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.167982 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.167992 4856 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168001 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168010 4856 
reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168018 4856 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168027 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168035 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dj6b\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-kube-api-access-8dj6b\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168043 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.168052 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.169987 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-54db87fb-r8q4w"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.176849 4856 scope.go:117] "RemoveContainer" containerID="32c113b6f7715bc1e4450f3f402b1f997eacbd1f27bf889a91c9380bda22c42d" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.181243 4856 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/placement-54db87fb-r8q4w"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.205590 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.206302 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.238569 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.269630 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.269584 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270319 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270413 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270566 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270692 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9ng8\" (UniqueName: \"kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270822 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.270907 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs\") pod \"c0be3924-19c6-4eee-bc60-7fbe28336b67\" (UID: \"c0be3924-19c6-4eee-bc60-7fbe28336b67\") " Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.271379 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: \"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.271455 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjn9p\" (UniqueName: \"kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p\") pod \"keystone-4b4e-account-create-update-b95jl\" (UID: 
\"bd4e74c5-7795-4fcd-ba5f-11cd6a05575d\") " pod="openstack/keystone-4b4e-account-create-update-b95jl" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.271686 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.275399 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.275513 4856 projected.go:194] Error preparing data for projected volume kube-api-access-sjn9p for pod openstack/keystone-4b4e-account-create-update-b95jl: failed to fetch token: pod "keystone-4b4e-account-create-update-b95jl" not found Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.275576 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. No retries permitted until 2026-03-20 13:49:23.275557264 +0000 UTC m=+1578.156583394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sjn9p" (UniqueName: "kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : failed to fetch token: pod "keystone-4b4e-account-create-update-b95jl" not found Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.276087 4856 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.276130 4856 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts podName:bd4e74c5-7795-4fcd-ba5f-11cd6a05575d nodeName:}" failed. 
No retries permitted until 2026-03-20 13:49:23.27611756 +0000 UTC m=+1578.157143690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts") pod "keystone-4b4e-account-create-update-b95jl" (UID: "bd4e74c5-7795-4fcd-ba5f-11cd6a05575d") : configmap "openstack-scripts" not found Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.282123 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dd2bd8e2-7f52-4c35-ac1d-f1175581a751" (UID: "dd2bd8e2-7f52-4c35-ac1d-f1175581a751"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.282412 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8" (OuterVolumeSpecName: "kube-api-access-b9ng8") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "kube-api-access-b9ng8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.282682 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs" (OuterVolumeSpecName: "logs") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.288487 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.307575 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.319331 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-812f-account-create-update-4crpm"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.329678 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.329784 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.341465 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-tvqkz"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.347659 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-tvqkz"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.347955 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data" (OuterVolumeSpecName: "config-data") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.368516 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-4b4e-account-create-update-b95jl"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.372732 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-4b4e-account-create-update-b95jl"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373232 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c0be3924-19c6-4eee-bc60-7fbe28336b67-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373253 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9ng8\" (UniqueName: \"kubernetes.io/projected/c0be3924-19c6-4eee-bc60-7fbe28336b67-kube-api-access-b9ng8\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373284 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc 
kubenswrapper[4856]: I0320 13:49:21.373294 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373303 4856 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dd2bd8e2-7f52-4c35-ac1d-f1175581a751-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373311 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.373319 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.400656 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c0be3924-19c6-4eee-bc60-7fbe28336b67" (UID: "c0be3924-19c6-4eee-bc60-7fbe28336b67"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.475528 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjn9p\" (UniqueName: \"kubernetes.io/projected/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-kube-api-access-sjn9p\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.475561 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c0be3924-19c6-4eee-bc60-7fbe28336b67-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.475589 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.588191 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.591236 4856 scope.go:117] "RemoveContainer" containerID="c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.596317 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.665364 4856 scope.go:117] "RemoveContainer" containerID="9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.690363 4856 scope.go:117] "RemoveContainer" containerID="c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e" Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.690703 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e\": container with ID 
starting with c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e not found: ID does not exist" containerID="c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.690729 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e"} err="failed to get container status \"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e\": rpc error: code = NotFound desc = could not find container \"c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e\": container with ID starting with c073fff9a2fb43f7feb3180657bd9b80ddd736f6479a6feb9808544981ba390e not found: ID does not exist" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.690747 4856 scope.go:117] "RemoveContainer" containerID="9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630" Mar 20 13:49:21 crc kubenswrapper[4856]: E0320 13:49:21.690921 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630\": container with ID starting with 9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630 not found: ID does not exist" containerID="9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.690937 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630"} err="failed to get container status \"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630\": rpc error: code = NotFound desc = could not find container \"9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630\": container with ID starting with 9f01d0b568a78eeec105165652349fe7777a2cc32213ad8ae85ee9060faa0630 not found: 
ID does not exist" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.690972 4856 scope.go:117] "RemoveContainer" containerID="89e762b7150be959d006f5453a95ec72bc93fd09878cdcfd7e75bf3a0eb48e5a" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.755362 4856 scope.go:117] "RemoveContainer" containerID="1851f20ad50d19aed32d3be103e4e1bc3e4b3415498ed43e1a10c91964f72276" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.797811 4856 scope.go:117] "RemoveContainer" containerID="609ee66b8ffd12f755f76cd9565ffad47cf29b376b4955ce6c3cee2c6b3c9d01" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.826878 4856 scope.go:117] "RemoveContainer" containerID="98f8a872358d11589374e68a262e864329cd9ae8329df4f8a4f3f630a5b9881f" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.835706 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" path="/var/lib/kubelet/pods/467cf6ce-9c87-45d6-9968-4d5372f70cb3/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.836571 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" path="/var/lib/kubelet/pods/4a53fecc-3af1-4ced-acd9-198296d50771/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.837791 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eaf0274-e5e2-4ece-868f-2e7ef81d0e72" path="/var/lib/kubelet/pods/6eaf0274-e5e2-4ece-868f-2e7ef81d0e72/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.838962 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" path="/var/lib/kubelet/pods/82ca80b6-bf8a-4741-a5e0-059f20fae69b/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.842675 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83848803-038e-4b5a-b161-9f20a629ae9a" path="/var/lib/kubelet/pods/83848803-038e-4b5a-b161-9f20a629ae9a/volumes" Mar 20 
13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.843634 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9010796b-5362-4885-8a2c-19668efe6e25" path="/var/lib/kubelet/pods/9010796b-5362-4885-8a2c-19668efe6e25/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.844682 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" path="/var/lib/kubelet/pods/a70b9b91-b663-40a8-a2a8-f1f57fc17bab/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.846415 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" path="/var/lib/kubelet/pods/b65360e6-90a7-4a8e-8647-6239e7c52e5b/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.847486 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca1680a-2f52-465d-83e2-93fcbf318e19" path="/var/lib/kubelet/pods/bca1680a-2f52-465d-83e2-93fcbf318e19/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.847932 4856 scope.go:117] "RemoveContainer" containerID="b6968ce7333be8a128c62fc951855092f3670f6f1c40b19c0d3cebdfeb3e52d2" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.848157 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4e74c5-7795-4fcd-ba5f-11cd6a05575d" path="/var/lib/kubelet/pods/bd4e74c5-7795-4fcd-ba5f-11cd6a05575d/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.848694 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a2b3f2-ab65-4013-9aa8-66d38474c2ab" path="/var/lib/kubelet/pods/c3a2b3f2-ab65-4013-9aa8-66d38474c2ab/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.849407 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d744a3b5-7023-416f-85cf-62400a452558" path="/var/lib/kubelet/pods/d744a3b5-7023-416f-85cf-62400a452558/volumes" Mar 20 13:49:21 crc kubenswrapper[4856]: I0320 13:49:21.851576 4856 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" path="/var/lib/kubelet/pods/dd2bd8e2-7f52-4c35-ac1d-f1175581a751/volumes" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.057063 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6c47c6db4b-7s8m7" event={"ID":"c0be3924-19c6-4eee-bc60-7fbe28336b67","Type":"ContainerDied","Data":"0e50e5a77f055c3cf693267e96a1e2305580bf0c20c030307a6e3c0f5314197d"} Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.057097 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6c47c6db4b-7s8m7" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.057184 4856 scope.go:117] "RemoveContainer" containerID="a9aa0ace58f4d55b503d781dd43d4983ad8cbc31b36675ba56fa2bddfba479db" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.069499 4856 generic.go:334] "Generic (PLEG): container finished" podID="628ad6bb-ab51-4021-9757-4247a1ccfa71" containerID="064628587a2ade55cadbdf45c06311c4f3381732bc6b09b5dbd0ce453ed18ef9" exitCode=0 Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.069677 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"628ad6bb-ab51-4021-9757-4247a1ccfa71","Type":"ContainerDied","Data":"064628587a2ade55cadbdf45c06311c4f3381732bc6b09b5dbd0ce453ed18ef9"} Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.087702 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.089322 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerDied","Data":"776beaea3c9377bde7e78faa04866103da0c3428a4e7fd231cab6e96b2a77971"} Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.090826 4856 generic.go:334] "Generic (PLEG): container finished" 
podID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerID="776beaea3c9377bde7e78faa04866103da0c3428a4e7fd231cab6e96b2a77971" exitCode=0 Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.094054 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6c47c6db4b-7s8m7"] Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.099597 4856 scope.go:117] "RemoveContainer" containerID="39d74692a61d64ac3b4733a0a061eddb4db4c4aede15d1da75d20e1e1dc827fa" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.099722 4856 generic.go:334] "Generic (PLEG): container finished" podID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerID="0dba6d4b780a407897cd32686827c4483d0924be6ffdda04151d3f6aaee1a114" exitCode=0 Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.099773 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerDied","Data":"0dba6d4b780a407897cd32686827c4483d0924be6ffdda04151d3f6aaee1a114"} Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.102337 4856 generic.go:334] "Generic (PLEG): container finished" podID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerID="77b452f8fbc35a66da9a652896b32275d2353cdfdc4f3d95fb386acc911a6b89" exitCode=0 Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.102385 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerDied","Data":"77b452f8fbc35a66da9a652896b32275d2353cdfdc4f3d95fb386acc911a6b89"} Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.161194 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:22 crc 
kubenswrapper[4856]: E0320 13:49:22.163039 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.164261 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.164320 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerName="nova-cell1-conductor-conductor" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.450317 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.470364 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.478573 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8qzc" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.485940 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.500580 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.506126 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23fc74c5-121e-4ac1-8d50-8be3393d080a/ovn-northd/0.log" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.506188 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.602928 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc9bw\" (UniqueName: \"kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw\") pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.602972 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs\") pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603000 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603023 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603046 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603063 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs\") pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603082 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kccnr\" (UniqueName: \"kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr\") pod \"925c991c-480f-4c77-a47e-d669aaa6d3dd\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603101 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts\") pod \"925c991c-480f-4c77-a47e-d669aaa6d3dd\" (UID: \"925c991c-480f-4c77-a47e-d669aaa6d3dd\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603121 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603153 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config\") pod \"628ad6bb-ab51-4021-9757-4247a1ccfa71\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") 
" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603175 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle\") pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603198 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603212 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603232 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs\") pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603256 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data\") pod \"628ad6bb-ab51-4021-9757-4247a1ccfa71\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603295 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs\") pod \"628ad6bb-ab51-4021-9757-4247a1ccfa71\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603328 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbnnl\" (UniqueName: \"kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603368 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603383 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603416 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghfzk\" (UniqueName: \"kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603430 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 
13:49:22.603450 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle\") pod \"628ad6bb-ab51-4021-9757-4247a1ccfa71\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603477 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs\") pod \"050eefc7-c113-4198-b1ad-0645ad765a2a\" (UID: \"050eefc7-c113-4198-b1ad-0645ad765a2a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603495 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603511 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4kv\" (UniqueName: \"kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv\") pod \"628ad6bb-ab51-4021-9757-4247a1ccfa71\" (UID: \"628ad6bb-ab51-4021-9757-4247a1ccfa71\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603534 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data\") pod \"8748e306-2876-434d-abef-f7d9cd7c7a07\" (UID: \"8748e306-2876-434d-abef-f7d9cd7c7a07\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.603552 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data\") 
pod \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\" (UID: \"2cf0465b-c48d-4c35-8e65-3f82c517ad98\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.604344 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.605001 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs" (OuterVolumeSpecName: "logs") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.605129 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "925c991c-480f-4c77-a47e-d669aaa6d3dd" (UID: "925c991c-480f-4c77-a47e-d669aaa6d3dd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.607672 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.608240 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "628ad6bb-ab51-4021-9757-4247a1ccfa71" (UID: "628ad6bb-ab51-4021-9757-4247a1ccfa71"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.610930 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr" (OuterVolumeSpecName: "kube-api-access-kccnr") pod "925c991c-480f-4c77-a47e-d669aaa6d3dd" (UID: "925c991c-480f-4c77-a47e-d669aaa6d3dd"). InnerVolumeSpecName "kube-api-access-kccnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.611365 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts" (OuterVolumeSpecName: "scripts") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.614652 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw" (OuterVolumeSpecName: "kube-api-access-tc9bw") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "kube-api-access-tc9bw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.617621 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts" (OuterVolumeSpecName: "scripts") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.618363 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data" (OuterVolumeSpecName: "config-data") pod "628ad6bb-ab51-4021-9757-4247a1ccfa71" (UID: "628ad6bb-ab51-4021-9757-4247a1ccfa71"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.620518 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl" (OuterVolumeSpecName: "kube-api-access-hbnnl") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "kube-api-access-hbnnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.623102 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.623182 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk" (OuterVolumeSpecName: "kube-api-access-ghfzk") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "kube-api-access-ghfzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.625315 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv" (OuterVolumeSpecName: "kube-api-access-wz4kv") pod "628ad6bb-ab51-4021-9757-4247a1ccfa71" (UID: "628ad6bb-ab51-4021-9757-4247a1ccfa71"). InnerVolumeSpecName "kube-api-access-wz4kv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.626385 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.637895 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.656076 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "628ad6bb-ab51-4021-9757-4247a1ccfa71" (UID: "628ad6bb-ab51-4021-9757-4247a1ccfa71"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.679124 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.698017 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data" (OuterVolumeSpecName: "config-data") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.698517 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.700379 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704661 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704730 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704774 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704822 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704890 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhr6h\" (UniqueName: \"kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704950 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.704987 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs\") pod \"23fc74c5-121e-4ac1-8d50-8be3393d080a\" (UID: \"23fc74c5-121e-4ac1-8d50-8be3393d080a\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705502 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghfzk\" (UniqueName: \"kubernetes.io/projected/050eefc7-c113-4198-b1ad-0645ad765a2a-kube-api-access-ghfzk\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705524 4856 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705536 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705549 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705561 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz4kv\" (UniqueName: \"kubernetes.io/projected/628ad6bb-ab51-4021-9757-4247a1ccfa71-kube-api-access-wz4kv\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705572 4856 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705583 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc9bw\" (UniqueName: \"kubernetes.io/projected/2cf0465b-c48d-4c35-8e65-3f82c517ad98-kube-api-access-tc9bw\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705593 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705603 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705623 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kccnr\" (UniqueName: \"kubernetes.io/projected/925c991c-480f-4c77-a47e-d669aaa6d3dd-kube-api-access-kccnr\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705631 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925c991c-480f-4c77-a47e-d669aaa6d3dd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705642 4856 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8748e306-2876-434d-abef-f7d9cd7c7a07-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705654 4856 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705664 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705674 4856 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/050eefc7-c113-4198-b1ad-0645ad765a2a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705684 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cf0465b-c48d-4c35-8e65-3f82c517ad98-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705694 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/628ad6bb-ab51-4021-9757-4247a1ccfa71-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705705 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbnnl\" (UniqueName: \"kubernetes.io/projected/8748e306-2876-434d-abef-f7d9cd7c7a07-kube-api-access-hbnnl\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705716 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.705726 4856 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: 
I0320 13:49:22.705782 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.706160 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts" (OuterVolumeSpecName: "scripts") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.706553 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config" (OuterVolumeSpecName: "config") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.707960 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h" (OuterVolumeSpecName: "kube-api-access-xhr6h") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "kube-api-access-xhr6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.712916 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.732917 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "628ad6bb-ab51-4021-9757-4247a1ccfa71" (UID: "628ad6bb-ab51-4021-9757-4247a1ccfa71"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.763825 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.765645 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2cf0465b-c48d-4c35-8e65-3f82c517ad98" (UID: "2cf0465b-c48d-4c35-8e65-3f82c517ad98"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.777943 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.779378 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.798106 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.799536 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.800719 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:22 crc kubenswrapper[4856]: E0320 13:49:22.800834 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.806655 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.806806 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.806914 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.807049 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 
13:49:22.807182 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.809938 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810049 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810138 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmh7x\" (UniqueName: \"kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x\") pod \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\" (UID: \"4397f29e-c0c9-4726-8fb4-1afe1441ec83\") " Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810500 4856 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/628ad6bb-ab51-4021-9757-4247a1ccfa71-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810558 4856 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810608 4856 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810658 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810716 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23fc74c5-121e-4ac1-8d50-8be3393d080a-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810778 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810839 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2cf0465b-c48d-4c35-8e65-3f82c517ad98-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810887 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhr6h\" (UniqueName: \"kubernetes.io/projected/23fc74c5-121e-4ac1-8d50-8be3393d080a-kube-api-access-xhr6h\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810934 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.810989 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.814895 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.815452 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.815577 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.816675 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.817231 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x" (OuterVolumeSpecName: "kube-api-access-qmh7x") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "kube-api-access-qmh7x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.824972 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.825559 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data" (OuterVolumeSpecName: "config-data") pod "8748e306-2876-434d-abef-f7d9cd7c7a07" (UID: "8748e306-2876-434d-abef-f7d9cd7c7a07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.826761 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.827995 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data" (OuterVolumeSpecName: "config-data") pod "050eefc7-c113-4198-b1ad-0645ad765a2a" (UID: "050eefc7-c113-4198-b1ad-0645ad765a2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.841638 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.850327 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "23fc74c5-121e-4ac1-8d50-8be3393d080a" (UID: "23fc74c5-121e-4ac1-8d50-8be3393d080a"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.857744 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "4397f29e-c0c9-4726-8fb4-1afe1441ec83" (UID: "4397f29e-c0c9-4726-8fb4-1afe1441ec83"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912420 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-generated\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912459 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8748e306-2876-434d-abef-f7d9cd7c7a07-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912497 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912509 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912523 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/050eefc7-c113-4198-b1ad-0645ad765a2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912534 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-config-data-default\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912548 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmh7x\" (UniqueName: \"kubernetes.io/projected/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kube-api-access-qmh7x\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912560 4856 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912571 4856 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4397f29e-c0c9-4726-8fb4-1afe1441ec83-galera-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912582 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912594 4856 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4397f29e-c0c9-4726-8fb4-1afe1441ec83-kolla-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.912610 4856 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23fc74c5-121e-4ac1-8d50-8be3393d080a-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:22 crc kubenswrapper[4856]: I0320 13:49:22.933033 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.013701 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.116938 4856 generic.go:334] "Generic (PLEG): container finished" podID="1ac0adc6-d09a-4367-838e-67f78ae5a050" containerID="4eb1f6a354f4bcf7cd6a7759bd2b31a120d7b8a455f854f1604a17e887048c77" exitCode=0
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.116996 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c6b6b7976-vc6rm" event={"ID":"1ac0adc6-d09a-4367-838e-67f78ae5a050","Type":"ContainerDied","Data":"4eb1f6a354f4bcf7cd6a7759bd2b31a120d7b8a455f854f1604a17e887048c77"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.122293 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23fc74c5-121e-4ac1-8d50-8be3393d080a/ovn-northd/0.log"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.122378 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23fc74c5-121e-4ac1-8d50-8be3393d080a","Type":"ContainerDied","Data":"5e13e39e09955b323bc7368d5da64b824f5d4c3bafd101b5ad4da1eb8c2c51c1"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.122421 4856 scope.go:117] "RemoveContainer" containerID="00f17ef2ce0ea5d744a6efd0758584d67cfece02bc7267d434c74cf910e4020f"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.122546 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.126901 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"628ad6bb-ab51-4021-9757-4247a1ccfa71","Type":"ContainerDied","Data":"f7fd88f425e651fc749f100ff75436f9bfbac32dd824126f6ad05384008785de"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.127002 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.143232 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"050eefc7-c113-4198-b1ad-0645ad765a2a","Type":"ContainerDied","Data":"0623e18d53c925ec76df1668374aef11ee1ac56a741346455b4cf199d417a6b9"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.143360 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.174546 4856 generic.go:334] "Generic (PLEG): container finished" podID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerID="bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092" exitCode=0
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.174645 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerDied","Data":"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.174672 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"4397f29e-c0c9-4726-8fb4-1afe1441ec83","Type":"ContainerDied","Data":"d7e2fd620db7fa74f03dd59aeb1eccd56caeea0f5e61b41c35f96ce465b1e491"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.174778 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.180120 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.185237 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k8qzc" event={"ID":"925c991c-480f-4c77-a47e-d669aaa6d3dd","Type":"ContainerDied","Data":"bdcc738b3a4177d25654fbf1c9d6e5ffa464aa212455dc311992d4287ce2f83c"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.186727 4856 scope.go:117] "RemoveContainer" containerID="9aa743cfc86a7c40aec2780f7f000f29b46afe48cf688f05641828a6fb282b69"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.186926 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k8qzc"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.194340 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8748e306-2876-434d-abef-f7d9cd7c7a07","Type":"ContainerDied","Data":"9fa9cf2114bf93f588f54c41845ed0f4c9ac74eb3965b3521d8055bcaaaa708a"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.194457 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.198983 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2cf0465b-c48d-4c35-8e65-3f82c517ad98","Type":"ContainerDied","Data":"6c08bf7ff547654111137122cc764a0291452db62e24c1bcbe531b1b0fa556a1"}
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.199139 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.202549 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.217321 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.225387 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.231941 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.243891 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.300709 4856 scope.go:117] "RemoveContainer" containerID="064628587a2ade55cadbdf45c06311c4f3381732bc6b09b5dbd0ce453ed18ef9"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.324945 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.337073 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.344328 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.352502 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.358136 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.361331 4856 scope.go:117] "RemoveContainer" containerID="88dd600e0e2eead445603b77f3af33d71b91561cfdbeb4eb78f38a3d301378d1"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.375755 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.392049 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k8qzc"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.402960 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k8qzc"]
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.424153 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c6b6b7976-vc6rm"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.431474 4856 scope.go:117] "RemoveContainer" containerID="8baa4310240d002ce370f1938841d57f264431b93149fa8dcd6abcd5b71b8287"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.453366 4856 scope.go:117] "RemoveContainer" containerID="776beaea3c9377bde7e78faa04866103da0c3428a4e7fd231cab6e96b2a77971"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.477056 4856 scope.go:117] "RemoveContainer" containerID="1921c9edd328e3c015ebb3e3c66b2a013cbe2bfbc222ca92d31d2027cab3c79d"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.504006 4856 scope.go:117] "RemoveContainer" containerID="bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.525167 4856 scope.go:117] "RemoveContainer" containerID="d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529027 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529176 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529214 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529346 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529391 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529414 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529467 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8v66\" (UniqueName: \"kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.529488 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle\") pod \"1ac0adc6-d09a-4367-838e-67f78ae5a050\" (UID: \"1ac0adc6-d09a-4367-838e-67f78ae5a050\") "
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.533966 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts" (OuterVolumeSpecName: "scripts") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.546859 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.549459 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66" (OuterVolumeSpecName: "kube-api-access-l8v66") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "kube-api-access-l8v66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.549557 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.566311 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data" (OuterVolumeSpecName: "config-data") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.579419 4856 scope.go:117] "RemoveContainer" containerID="bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.580497 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092\": container with ID starting with bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092 not found: ID does not exist" containerID="bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.580537 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092"} err="failed to get container status \"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092\": rpc error: code = NotFound desc = could not find container \"bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092\": container with ID starting with bcd7908adb4817f2bb22d448f131c4da0d4015068f47fcecb537041af94d1092 not found: ID does not exist"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.580564 4856 scope.go:117] "RemoveContainer" containerID="d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.581909 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d\": container with ID starting with d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d not found: ID does not exist" containerID="d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.581963 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d"} err="failed to get container status \"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d\": rpc error: code = NotFound desc = could not find container \"d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d\": container with ID starting with d32b1f690c0180689460496903d46e5af09df5a2238b78b187ed2a1633cc010d not found: ID does not exist"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.581992 4856 scope.go:117] "RemoveContainer" containerID="0537ae0b31b688d61e33527cfef4b3a828f73f588fd203119e3e0fc88c53d392"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.589169 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.607959 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.615884 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1ac0adc6-d09a-4367-838e-67f78ae5a050" (UID: "1ac0adc6-d09a-4367-838e-67f78ae5a050"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.637149 4856 scope.go:117] "RemoveContainer" containerID="0dba6d4b780a407897cd32686827c4483d0924be6ffdda04151d3f6aaee1a114"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641243 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641299 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641311 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641325 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8v66\" (UniqueName: \"kubernetes.io/projected/1ac0adc6-d09a-4367-838e-67f78ae5a050-kube-api-access-l8v66\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641337 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641347 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641357 4856 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.641366 4856 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1ac0adc6-d09a-4367-838e-67f78ae5a050-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.660602 4856 scope.go:117] "RemoveContainer" containerID="77b452f8fbc35a66da9a652896b32275d2353cdfdc4f3d95fb386acc911a6b89"
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.660587 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.662205 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.664912 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Mar 20 13:49:23 crc kubenswrapper[4856]: E0320 13:49:23.664944 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.688361 4856 scope.go:117] "RemoveContainer" containerID="fdc9d6121689074032de5b8e98217043a6aac903a3a4183cd06f9641245a15c5"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.836251 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" path="/var/lib/kubelet/pods/050eefc7-c113-4198-b1ad-0645ad765a2a/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.837803 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" path="/var/lib/kubelet/pods/23fc74c5-121e-4ac1-8d50-8be3393d080a/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.840055 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" path="/var/lib/kubelet/pods/2cf0465b-c48d-4c35-8e65-3f82c517ad98/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.841645 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" path="/var/lib/kubelet/pods/4397f29e-c0c9-4726-8fb4-1afe1441ec83/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.843159 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628ad6bb-ab51-4021-9757-4247a1ccfa71" path="/var/lib/kubelet/pods/628ad6bb-ab51-4021-9757-4247a1ccfa71/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.845040 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" path="/var/lib/kubelet/pods/8748e306-2876-434d-abef-f7d9cd7c7a07/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.846081 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925c991c-480f-4c77-a47e-d669aaa6d3dd" path="/var/lib/kubelet/pods/925c991c-480f-4c77-a47e-d669aaa6d3dd/volumes"
Mar 20 13:49:23 crc kubenswrapper[4856]: I0320 13:49:23.846831 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" path="/var/lib/kubelet/pods/c0be3924-19c6-4eee-bc60-7fbe28336b67/volumes"
Mar 20 13:49:24 crc kubenswrapper[4856]: I0320 13:49:24.227546 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7c6b6b7976-vc6rm" event={"ID":"1ac0adc6-d09a-4367-838e-67f78ae5a050","Type":"ContainerDied","Data":"0b6e7120a465c758f2bb754355b306589c204a9f6ee82cd864aa258ed424cc02"}
Mar 20 13:49:24 crc kubenswrapper[4856]: I0320 13:49:24.227594 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7c6b6b7976-vc6rm"
Mar 20 13:49:24 crc kubenswrapper[4856]: I0320 13:49:24.227596 4856 scope.go:117] "RemoveContainer" containerID="4eb1f6a354f4bcf7cd6a7759bd2b31a120d7b8a455f854f1604a17e887048c77"
Mar 20 13:49:24 crc kubenswrapper[4856]: I0320 13:49:24.256055 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"]
Mar 20 13:49:24 crc kubenswrapper[4856]: I0320 13:49:24.262022 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7c6b6b7976-vc6rm"]
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.007171 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.105242 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh"
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.190131 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdtsr\" (UniqueName: \"kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr\") pod \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") "
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.190241 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data\") pod \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") "
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.190321 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle\") pod \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\" (UID: \"7657bdc0-a1e5-4421-aceb-8cd410fc0226\") "
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.195974 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr" (OuterVolumeSpecName: "kube-api-access-bdtsr") pod "7657bdc0-a1e5-4421-aceb-8cd410fc0226" (UID: "7657bdc0-a1e5-4421-aceb-8cd410fc0226"). InnerVolumeSpecName "kube-api-access-bdtsr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.225400 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data" (OuterVolumeSpecName: "config-data") pod "7657bdc0-a1e5-4421-aceb-8cd410fc0226" (UID: "7657bdc0-a1e5-4421-aceb-8cd410fc0226"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.225573 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7657bdc0-a1e5-4421-aceb-8cd410fc0226" (UID: "7657bdc0-a1e5-4421-aceb-8cd410fc0226"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.237872 4856 generic.go:334] "Generic (PLEG): container finished" podID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.237936 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3","Type":"ContainerDied","Data":"ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.241080 4856 generic.go:334] "Generic (PLEG): container finished" podID="b314fa97-2e86-46ef-8034-97bb179a3139" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.241149 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b314fa97-2e86-46ef-8034-97bb179a3139","Type":"ContainerDied","Data":"6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.243490 4856 generic.go:334] "Generic (PLEG): container finished" podID="1f98c320-f318-443d-816d-f3dec9784023" containerID="0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.243548 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerDied","Data":"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.243576 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh" event={"ID":"1f98c320-f318-443d-816d-f3dec9784023","Type":"ContainerDied","Data":"12d0d455725d7a6ac68ec89b7a2303532724ab31cb990ca5bcfab6c6d996734c"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.243593 4856 scope.go:117] "RemoveContainer" containerID="0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804"
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.243685 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-65d8844bc8-mjgnh"
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.251187 4856 generic.go:334] "Generic (PLEG): container finished" podID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.251254 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7657bdc0-a1e5-4421-aceb-8cd410fc0226","Type":"ContainerDied","Data":"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.251335 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7657bdc0-a1e5-4421-aceb-8cd410fc0226","Type":"ContainerDied","Data":"49d0def3316bc781556684d21a232d2b5373089f3eec9874cb227de47773b82b"}
Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.251414 4856 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.261067 4856 generic.go:334] "Generic (PLEG): container finished" podID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerID="981b8b827dab20c49a4b95a3ff976be714d7cf0a7f3e2f70b0c810a4c1492d48" exitCode=0 Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.261319 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerDied","Data":"981b8b827dab20c49a4b95a3ff976be714d7cf0a7f3e2f70b0c810a4c1492d48"} Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.285363 4856 scope.go:117] "RemoveContainer" containerID="53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.289778 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.291736 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle\") pod \"1f98c320-f318-443d-816d-f3dec9784023\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.291801 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs\") pod \"1f98c320-f318-443d-816d-f3dec9784023\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.291857 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6bzt\" (UniqueName: \"kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt\") pod \"1f98c320-f318-443d-816d-f3dec9784023\" (UID: 
\"1f98c320-f318-443d-816d-f3dec9784023\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.291878 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom\") pod \"1f98c320-f318-443d-816d-f3dec9784023\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.291944 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data\") pod \"1f98c320-f318-443d-816d-f3dec9784023\" (UID: \"1f98c320-f318-443d-816d-f3dec9784023\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.292533 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.292552 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdtsr\" (UniqueName: \"kubernetes.io/projected/7657bdc0-a1e5-4421-aceb-8cd410fc0226-kube-api-access-bdtsr\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.292564 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7657bdc0-a1e5-4421-aceb-8cd410fc0226-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.292687 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs" (OuterVolumeSpecName: "logs") pod "1f98c320-f318-443d-816d-f3dec9784023" (UID: "1f98c320-f318-443d-816d-f3dec9784023"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.297301 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt" (OuterVolumeSpecName: "kube-api-access-l6bzt") pod "1f98c320-f318-443d-816d-f3dec9784023" (UID: "1f98c320-f318-443d-816d-f3dec9784023"). InnerVolumeSpecName "kube-api-access-l6bzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.299236 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1f98c320-f318-443d-816d-f3dec9784023" (UID: "1f98c320-f318-443d-816d-f3dec9784023"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.310875 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f98c320-f318-443d-816d-f3dec9784023" (UID: "1f98c320-f318-443d-816d-f3dec9784023"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.337182 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.342932 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data" (OuterVolumeSpecName: "config-data") pod "1f98c320-f318-443d-816d-f3dec9784023" (UID: "1f98c320-f318-443d-816d-f3dec9784023"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.351756 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.358142 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.368105 4856 scope.go:117] "RemoveContainer" containerID="0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804" Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.368678 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804\": container with ID starting with 0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804 not found: ID does not exist" containerID="0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.368708 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804"} err="failed to get container status \"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804\": rpc error: code = NotFound desc = could not find container \"0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804\": container with ID starting with 0fc7ec0518f746169369dade775c14254aff7c7882e1280cbad50ad393eda804 not found: ID does not exist" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.368744 4856 scope.go:117] "RemoveContainer" containerID="53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd" Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.373209 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd\": container with ID starting with 53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd not found: ID does not exist" containerID="53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.373298 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd"} err="failed to get container status \"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd\": rpc error: code = NotFound desc = could not find container \"53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd\": container with ID starting with 53b69402785210223a5a7302026b78d26ad7060949d599a917a00bb19d311ccd not found: ID does not exist" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.373332 4856 scope.go:117] "RemoveContainer" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.394857 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397182 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs\") pod \"ee995e44-3c2c-4ca3-9945-b9b757269749\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397239 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data\") pod \"b314fa97-2e86-46ef-8034-97bb179a3139\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397368 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle\") pod \"ee995e44-3c2c-4ca3-9945-b9b757269749\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397405 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data\") pod \"ee995e44-3c2c-4ca3-9945-b9b757269749\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397442 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hcsn\" (UniqueName: \"kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn\") pod \"ee995e44-3c2c-4ca3-9945-b9b757269749\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397465 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom\") pod \"ee995e44-3c2c-4ca3-9945-b9b757269749\" (UID: \"ee995e44-3c2c-4ca3-9945-b9b757269749\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397503 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvbk4\" (UniqueName: \"kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4\") pod \"b314fa97-2e86-46ef-8034-97bb179a3139\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397533 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle\") pod \"b314fa97-2e86-46ef-8034-97bb179a3139\" (UID: \"b314fa97-2e86-46ef-8034-97bb179a3139\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397654 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs" (OuterVolumeSpecName: "logs") pod "ee995e44-3c2c-4ca3-9945-b9b757269749" (UID: "ee995e44-3c2c-4ca3-9945-b9b757269749"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397865 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6bzt\" (UniqueName: \"kubernetes.io/projected/1f98c320-f318-443d-816d-f3dec9784023-kube-api-access-l6bzt\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397883 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397893 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397905 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee995e44-3c2c-4ca3-9945-b9b757269749-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397913 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f98c320-f318-443d-816d-f3dec9784023-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.397921 4856 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f98c320-f318-443d-816d-f3dec9784023-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.402017 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4" (OuterVolumeSpecName: "kube-api-access-rvbk4") pod "b314fa97-2e86-46ef-8034-97bb179a3139" (UID: 
"b314fa97-2e86-46ef-8034-97bb179a3139"). InnerVolumeSpecName "kube-api-access-rvbk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.402808 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn" (OuterVolumeSpecName: "kube-api-access-8hcsn") pod "ee995e44-3c2c-4ca3-9945-b9b757269749" (UID: "ee995e44-3c2c-4ca3-9945-b9b757269749"). InnerVolumeSpecName "kube-api-access-8hcsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.405003 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ee995e44-3c2c-4ca3-9945-b9b757269749" (UID: "ee995e44-3c2c-4ca3-9945-b9b757269749"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.405171 4856 scope.go:117] "RemoveContainer" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.411861 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd\": container with ID starting with 9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd not found: ID does not exist" containerID="9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.411904 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd"} err="failed to get container status 
\"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd\": rpc error: code = NotFound desc = could not find container \"9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd\": container with ID starting with 9965ddb48bc5126bef2cbbd5011ed469dbd6228b65936eecdace2c0dde235fcd not found: ID does not exist" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.425340 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee995e44-3c2c-4ca3-9945-b9b757269749" (UID: "ee995e44-3c2c-4ca3-9945-b9b757269749"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.432456 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b314fa97-2e86-46ef-8034-97bb179a3139" (UID: "b314fa97-2e86-46ef-8034-97bb179a3139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.449533 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data" (OuterVolumeSpecName: "config-data") pod "b314fa97-2e86-46ef-8034-97bb179a3139" (UID: "b314fa97-2e86-46ef-8034-97bb179a3139"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.453876 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data" (OuterVolumeSpecName: "config-data") pod "ee995e44-3c2c-4ca3-9945-b9b757269749" (UID: "ee995e44-3c2c-4ca3-9945-b9b757269749"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.498577 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z56wq\" (UniqueName: \"kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq\") pod \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.498948 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data\") pod \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.498969 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle\") pod \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\" (UID: \"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3\") " Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499154 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499167 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499177 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hcsn\" (UniqueName: \"kubernetes.io/projected/ee995e44-3c2c-4ca3-9945-b9b757269749-kube-api-access-8hcsn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc 
kubenswrapper[4856]: I0320 13:49:25.499186 4856 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee995e44-3c2c-4ca3-9945-b9b757269749-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499195 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvbk4\" (UniqueName: \"kubernetes.io/projected/b314fa97-2e86-46ef-8034-97bb179a3139-kube-api-access-rvbk4\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499204 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.499215 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b314fa97-2e86-46ef-8034-97bb179a3139-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.501647 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq" (OuterVolumeSpecName: "kube-api-access-z56wq") pod "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" (UID: "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3"). InnerVolumeSpecName "kube-api-access-z56wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.517116 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" (UID: "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.518148 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data" (OuterVolumeSpecName: "config-data") pod "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" (UID: "a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.518764 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.519157 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.519712 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.519751 4856 prober.go:104] "Probe 
errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.519908 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.521020 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.522313 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:25 crc kubenswrapper[4856]: E0320 13:49:25.522389 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 
13:49:25.576329 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.581616 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-65d8844bc8-mjgnh"] Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.600180 4856 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.600209 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.600220 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z56wq\" (UniqueName: \"kubernetes.io/projected/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3-kube-api-access-z56wq\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.828672 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ac0adc6-d09a-4367-838e-67f78ae5a050" path="/var/lib/kubelet/pods/1ac0adc6-d09a-4367-838e-67f78ae5a050/volumes" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.829368 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f98c320-f318-443d-816d-f3dec9784023" path="/var/lib/kubelet/pods/1f98c320-f318-443d-816d-f3dec9784023/volumes" Mar 20 13:49:25 crc kubenswrapper[4856]: I0320 13:49:25.829908 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" path="/var/lib/kubelet/pods/7657bdc0-a1e5-4421-aceb-8cd410fc0226/volumes" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.270103 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3","Type":"ContainerDied","Data":"024d926d8f991c82784d31c13f408ac76c78c0baadf9d7d4ad6f019bdc0747c0"} Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.270172 4856 scope.go:117] "RemoveContainer" containerID="ab03b487c11a612f57728e66da81986d055a18632e44fde76dd4ca7565653d6e" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.270175 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.273589 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b314fa97-2e86-46ef-8034-97bb179a3139","Type":"ContainerDied","Data":"4c9871dd58a8c80f84c25c7883deecb0e49b93852b1e1481ed3302180e7ffe73"} Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.273647 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.278895 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56d6d658ff-ch8jp" event={"ID":"ee995e44-3c2c-4ca3-9945-b9b757269749","Type":"ContainerDied","Data":"6138aca70e60643cb8d6b847c00370d88b03dfdea1ff22eb9ceff9d2c4e67ad7"} Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.278983 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56d6d658ff-ch8jp" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.290999 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.301209 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.304757 4856 scope.go:117] "RemoveContainer" containerID="6191819b7767bd877a4a309716b01ad39eb89ce4ab7c5255d8190b856875ae2f" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.312899 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.324030 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-56d6d658ff-ch8jp"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.333946 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.337393 4856 scope.go:117] "RemoveContainer" containerID="981b8b827dab20c49a4b95a3ff976be714d7cf0a7f3e2f70b0c810a4c1492d48" Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.342193 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:49:26 crc kubenswrapper[4856]: I0320 13:49:26.360101 4856 scope.go:117] "RemoveContainer" containerID="d930765c8702d78cbe6f1e7514f35eb8ca4969feb1ad881999f4f96ab179ba9c" Mar 20 13:49:27 crc kubenswrapper[4856]: I0320 13:49:27.832516 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" path="/var/lib/kubelet/pods/a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3/volumes" Mar 20 13:49:27 crc kubenswrapper[4856]: I0320 13:49:27.833691 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" 
path="/var/lib/kubelet/pods/b314fa97-2e86-46ef-8034-97bb179a3139/volumes" Mar 20 13:49:27 crc kubenswrapper[4856]: I0320 13:49:27.834783 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" path="/var/lib/kubelet/pods/ee995e44-3c2c-4ca3-9945-b9b757269749/volumes" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.025201 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-58f96446cc-blkvz" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.071966 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072051 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072074 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072096 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072121 4856 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072151 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.072211 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd8l9\" (UniqueName: \"kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9\") pod \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\" (UID: \"4f4bbca3-e3dd-4be1-bf5b-43f88956883b\") " Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.103788 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9" (OuterVolumeSpecName: "kube-api-access-kd8l9") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "kube-api-access-kd8l9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.103837 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.125874 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.136171 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.149525 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config" (OuterVolumeSpecName: "config") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.153192 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.170030 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4f4bbca3-e3dd-4be1-bf5b-43f88956883b" (UID: "4f4bbca3-e3dd-4be1-bf5b-43f88956883b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174035 4856 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174071 4856 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174087 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174101 4856 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174112 4856 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174124 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.174163 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd8l9\" (UniqueName: \"kubernetes.io/projected/4f4bbca3-e3dd-4be1-bf5b-43f88956883b-kube-api-access-kd8l9\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.332754 4856 generic.go:334] "Generic (PLEG): container finished" podID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerID="22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30" exitCode=0 Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.332807 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerDied","Data":"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30"} Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.332838 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-58f96446cc-blkvz" event={"ID":"4f4bbca3-e3dd-4be1-bf5b-43f88956883b","Type":"ContainerDied","Data":"17bd504c851f60275e8b0faecdf7f32f0ba8f22f56e42fd3fd249e92deef7e0d"} Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.332843 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-58f96446cc-blkvz" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.332859 4856 scope.go:117] "RemoveContainer" containerID="bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.365542 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-58f96446cc-blkvz"] Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.368455 4856 scope.go:117] "RemoveContainer" containerID="22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.370376 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-58f96446cc-blkvz"] Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.390977 4856 scope.go:117] "RemoveContainer" containerID="bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53" Mar 20 13:49:29 crc kubenswrapper[4856]: E0320 13:49:29.391500 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53\": container with ID starting with bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53 not found: ID does not exist" containerID="bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.391543 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53"} err="failed to get container status \"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53\": rpc error: code = NotFound desc = could not find container \"bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53\": container with ID starting with bc8374f964d989f4f1184fdafa53d97b150413ee3211ec5434335b9fdb7c0a53 not found: ID does not exist" Mar 20 13:49:29 
crc kubenswrapper[4856]: I0320 13:49:29.391573 4856 scope.go:117] "RemoveContainer" containerID="22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30" Mar 20 13:49:29 crc kubenswrapper[4856]: E0320 13:49:29.392026 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30\": container with ID starting with 22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30 not found: ID does not exist" containerID="22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.392068 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30"} err="failed to get container status \"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30\": rpc error: code = NotFound desc = could not find container \"22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30\": container with ID starting with 22c34a902f254b2481f16cc18c96b0f8a9bea1f47a0f62781580a89b50217b30 not found: ID does not exist" Mar 20 13:49:29 crc kubenswrapper[4856]: I0320 13:49:29.831114 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" path="/var/lib/kubelet/pods/4f4bbca3-e3dd-4be1-bf5b-43f88956883b/volumes" Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.519543 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 
13:49:30.519865 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.520156 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.520183 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.521879 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.523063 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.524480 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:30 crc kubenswrapper[4856]: E0320 13:49:30.524529 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.519557 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.521060 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.521451 4856 log.go:32] "ExecSync cmd from runtime 
service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.521602 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.521661 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.523603 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.525567 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:35 crc kubenswrapper[4856]: E0320 13:49:35.525635 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.518894 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.520502 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.520628 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.521432 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.521611 4856 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.522761 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.526986 4856 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 20 13:49:40 crc kubenswrapper[4856]: E0320 13:49:40.527127 4856 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-qxlnx" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.220990 4856 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxlnx_8e5225f1-7607-4e11-904f-0e40e483d384/ovs-vswitchd/0.log" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.222270 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267729 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267798 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267849 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267871 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7wrd\" (UniqueName: \"kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267889 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: 
\"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267880 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib" (OuterVolumeSpecName: "var-lib") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267916 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log\") pod \"8e5225f1-7607-4e11-904f-0e40e483d384\" (UID: \"8e5225f1-7607-4e11-904f-0e40e483d384\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267972 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.267962 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run" (OuterVolumeSpecName: "var-run") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.268067 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log" (OuterVolumeSpecName: "var-log") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). 
InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.268345 4856 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.268358 4856 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.268366 4856 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.268374 4856 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8e5225f1-7607-4e11-904f-0e40e483d384-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.269234 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts" (OuterVolumeSpecName: "scripts") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.276042 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd" (OuterVolumeSpecName: "kube-api-access-z7wrd") pod "8e5225f1-7607-4e11-904f-0e40e483d384" (UID: "8e5225f1-7607-4e11-904f-0e40e483d384"). InnerVolumeSpecName "kube-api-access-z7wrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.369273 4856 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8e5225f1-7607-4e11-904f-0e40e483d384-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.369336 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7wrd\" (UniqueName: \"kubernetes.io/projected/8e5225f1-7607-4e11-904f-0e40e483d384-kube-api-access-z7wrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.504157 4856 generic.go:334] "Generic (PLEG): container finished" podID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerID="64b602f06351a958adc8f20a603944b2103c33e0a114f4cd698496a6b2cd9d5a" exitCode=137 Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.504242 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"64b602f06351a958adc8f20a603944b2103c33e0a114f4cd698496a6b2cd9d5a"} Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.507318 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qxlnx_8e5225f1-7607-4e11-904f-0e40e483d384/ovs-vswitchd/0.log" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.508509 4856 generic.go:334] "Generic (PLEG): container finished" podID="8e5225f1-7607-4e11-904f-0e40e483d384" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" exitCode=137 Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.508556 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerDied","Data":"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb"} Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.508595 
4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qxlnx" event={"ID":"8e5225f1-7607-4e11-904f-0e40e483d384","Type":"ContainerDied","Data":"f6bfdc601a8cd61414d901eb0119efb90ad558eb4c8feea245fd73aee433f4a1"} Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.508623 4856 scope.go:117] "RemoveContainer" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.508800 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qxlnx" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.538578 4856 scope.go:117] "RemoveContainer" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.559593 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.568089 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-qxlnx"] Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.571532 4856 scope.go:117] "RemoveContainer" containerID="7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.633362 4856 scope.go:117] "RemoveContainer" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" Mar 20 13:49:45 crc kubenswrapper[4856]: E0320 13:49:45.634561 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb\": container with ID starting with 1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb not found: ID does not exist" containerID="1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.634601 4856 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb"} err="failed to get container status \"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb\": rpc error: code = NotFound desc = could not find container \"1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb\": container with ID starting with 1dd72344f7290ebf1792800b9b26bd62a7d07f31adc96d6476c379796b917bfb not found: ID does not exist" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.634627 4856 scope.go:117] "RemoveContainer" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" Mar 20 13:49:45 crc kubenswrapper[4856]: E0320 13:49:45.635147 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a\": container with ID starting with a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a not found: ID does not exist" containerID="a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.635174 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a"} err="failed to get container status \"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a\": rpc error: code = NotFound desc = could not find container \"a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a\": container with ID starting with a101d4a12439c182932ed86e3a7718880a57b72fe54733f25129ddf7a23abf7a not found: ID does not exist" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.635191 4856 scope.go:117] "RemoveContainer" containerID="7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c" Mar 20 13:49:45 crc kubenswrapper[4856]: E0320 
13:49:45.635517 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c\": container with ID starting with 7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c not found: ID does not exist" containerID="7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.635619 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c"} err="failed to get container status \"7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c\": rpc error: code = NotFound desc = could not find container \"7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c\": container with ID starting with 7ff991bad92cf74626a4f3638a955362618c9540af3b11a6b6317a2184b53b3c not found: ID does not exist" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.830514 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" path="/var/lib/kubelet/pods/8e5225f1-7607-4e11-904f-0e40e483d384/volumes" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.849674 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876201 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876329 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876399 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876428 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2mll\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876483 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.876523 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache\") pod \"179c29eb-c606-4429-8bbd-f7a4f62790f9\" (UID: \"179c29eb-c606-4429-8bbd-f7a4f62790f9\") " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.877352 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock" (OuterVolumeSpecName: "lock") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.879591 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache" (OuterVolumeSpecName: "cache") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.882784 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.883115 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.884489 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll" (OuterVolumeSpecName: "kube-api-access-d2mll") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "kube-api-access-d2mll". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.978529 4856 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.978566 4856 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.978578 4856 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.978590 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2mll\" (UniqueName: \"kubernetes.io/projected/179c29eb-c606-4429-8bbd-f7a4f62790f9-kube-api-access-d2mll\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.978602 4856 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/179c29eb-c606-4429-8bbd-f7a4f62790f9-cache\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:45 crc kubenswrapper[4856]: I0320 13:49:45.994989 4856 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.079740 4856 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.183221 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "179c29eb-c606-4429-8bbd-f7a4f62790f9" (UID: "179c29eb-c606-4429-8bbd-f7a4f62790f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.282804 4856 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/179c29eb-c606-4429-8bbd-f7a4f62790f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.523774 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"179c29eb-c606-4429-8bbd-f7a4f62790f9","Type":"ContainerDied","Data":"4575417590e14a7fb3606e98da48c8cff9594ce441e6131c18a30115f63ac022"} Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.523832 4856 scope.go:117] "RemoveContainer" containerID="64b602f06351a958adc8f20a603944b2103c33e0a114f4cd698496a6b2cd9d5a" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.524025 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.557535 4856 scope.go:117] "RemoveContainer" containerID="2bbe17d4032ceb3620e99676e538a3436047e64cd35c8e408a7e830c8d0c8916" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.567812 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.573225 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.584405 4856 scope.go:117] "RemoveContainer" containerID="7b0eeecc01033a001f3ec16d0f85af1f0a2b22608ba9b74a4124f80db63f7023" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.604940 4856 scope.go:117] "RemoveContainer" containerID="8dc8373e74ff37fef6751f9af5ae4fe48e9297688f83e021db44786f7698fae2" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.621236 4856 scope.go:117] "RemoveContainer" containerID="3ced5a863f2674ab088b2cfc34623e28ac2f1620c7f6f8dc4f2edb1bd867f7c6" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.636464 4856 scope.go:117] "RemoveContainer" containerID="7324448e1753fd76381dd12b6e7d9dc16d8ab4a8e4930a9aeb6e2de164019847" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.660504 4856 scope.go:117] "RemoveContainer" containerID="9c977a756055e2886ad5ca74cb43b1715a8e35dc20df5e2db03dadb213f99ae2" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.678222 4856 scope.go:117] "RemoveContainer" containerID="c25c88d12f3de7091d79c129a51dbb13814ceb3eb7e0f5f552600e9715f22cd3" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.693661 4856 scope.go:117] "RemoveContainer" containerID="66ddb19b6cb3cfb47423e82ae3b8ce578f9275b8ddaeaefca9cb0f6db6d03dd4" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.711230 4856 scope.go:117] "RemoveContainer" containerID="9708b2248759cc0b809d2397329f741db5de5c3791b0c0d67c59ef3236106ae9" Mar 20 13:49:46 crc 
kubenswrapper[4856]: I0320 13:49:46.729723 4856 scope.go:117] "RemoveContainer" containerID="336bef9fbe708a542dba755da9664d99b5431f33ebb505b3d410fc05b0726883" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.789387 4856 scope.go:117] "RemoveContainer" containerID="bf17e5b77e3a7a1bdf3ad21621b736ab3a7b00f1bfc4f9f9e63067c32d46e273" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.820468 4856 scope.go:117] "RemoveContainer" containerID="7fec7d9d05c7e6a547275f47329f4ae8d58fe68cc7259691ff5842999eacf987" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.845715 4856 scope.go:117] "RemoveContainer" containerID="103cdf38f9ed46d33180e33da2171b27c42859dedc97518509e187489741b123" Mar 20 13:49:46 crc kubenswrapper[4856]: I0320 13:49:46.869176 4856 scope.go:117] "RemoveContainer" containerID="99aed41847b9a822596138e8aef2e4873222dfb5643d9cd387d54e1029fa26ae" Mar 20 13:49:47 crc kubenswrapper[4856]: I0320 13:49:47.831035 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" path="/var/lib/kubelet/pods/179c29eb-c606-4429-8bbd-f7a4f62790f9/volumes" Mar 20 13:49:49 crc kubenswrapper[4856]: I0320 13:49:49.962214 4856 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podef92f699-7db4-4425-949a-693de8e803a3"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podef92f699-7db4-4425-949a-693de8e803a3] : Timed out while waiting for systemd to remove kubepods-besteffort-podef92f699_7db4_4425_949a_693de8e803a3.slice" Mar 20 13:49:49 crc kubenswrapper[4856]: E0320 13:49:49.962677 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podef92f699-7db4-4425-949a-693de8e803a3] : unable to destroy cgroup paths for cgroup [kubepods besteffort podef92f699-7db4-4425-949a-693de8e803a3] : Timed out while waiting for systemd to remove 
kubepods-besteffort-podef92f699_7db4_4425_949a_693de8e803a3.slice" pod="openstack/nova-cell1-9268-account-create-update-lbntw" podUID="ef92f699-7db4-4425-949a-693de8e803a3" Mar 20 13:49:49 crc kubenswrapper[4856]: I0320 13:49:49.972359 4856 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podc3a2b3f2-ab65-4013-9aa8-66d38474c2ab"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podc3a2b3f2-ab65-4013-9aa8-66d38474c2ab] : Timed out while waiting for systemd to remove kubepods-besteffort-podc3a2b3f2_ab65_4013_9aa8_66d38474c2ab.slice" Mar 20 13:49:50 crc kubenswrapper[4856]: I0320 13:49:50.574401 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9268-account-create-update-lbntw" Mar 20 13:49:50 crc kubenswrapper[4856]: I0320 13:49:50.644949 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:50 crc kubenswrapper[4856]: I0320 13:49:50.650972 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9268-account-create-update-lbntw"] Mar 20 13:49:51 crc kubenswrapper[4856]: I0320 13:49:51.832785 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef92f699-7db4-4425-949a-693de8e803a3" path="/var/lib/kubelet/pods/ef92f699-7db4-4425-949a-693de8e803a3/volumes" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143051 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-b9x2v"] Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.143905 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143919 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-auditor" Mar 20 
13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.143927 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143934 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.143947 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="setup-container" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143953 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="setup-container" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.143960 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143967 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.143977 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.143985 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144001 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144008 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-log" Mar 20 13:50:00 crc 
kubenswrapper[4856]: E0320 13:50:00.144015 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144023 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144031 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="sg-core" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144039 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="sg-core" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144051 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" containerName="kube-state-metrics" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144060 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" containerName="kube-state-metrics" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144073 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144080 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144088 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="proxy-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144095 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="proxy-httpd" Mar 20 13:50:00 crc 
kubenswrapper[4856]: E0320 13:50:00.144108 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144116 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144131 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144138 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144146 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144153 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-server" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144161 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="cinder-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144168 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="cinder-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144181 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144188 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener" Mar 20 13:50:00 crc 
kubenswrapper[4856]: E0320 13:50:00.144197 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="rabbitmq" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144204 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="rabbitmq" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144217 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ac0adc6-d09a-4367-838e-67f78ae5a050" containerName="keystone-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144224 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ac0adc6-d09a-4367-838e-67f78ae5a050" containerName="keystone-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144238 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144245 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144257 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144264 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144291 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-reaper" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144297 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-reaper" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144308 4856 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144316 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144324 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="swift-recon-cron" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144331 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="swift-recon-cron" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144340 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144347 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144358 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144365 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-server" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144374 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144380 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 
13:50:00.144392 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144400 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144408 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144416 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144424 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="probe" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144431 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="probe" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144441 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144448 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144457 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-notification-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144465 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-notification-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144477 
4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144485 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144496 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144504 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-server" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144515 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144523 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144532 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144539 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144548 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628ad6bb-ab51-4021-9757-4247a1ccfa71" containerName="memcached" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144554 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="628ad6bb-ab51-4021-9757-4247a1ccfa71" containerName="memcached" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144566 4856 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144573 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144581 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144590 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144603 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144610 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144620 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="openstack-network-exporter" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144628 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="openstack-network-exporter" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144635 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144642 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144652 4856 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-expirer" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144659 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-expirer" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144669 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="mysql-bootstrap" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144676 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="mysql-bootstrap" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144687 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-metadata" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144695 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-metadata" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144707 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-central-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144714 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-central-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144723 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144732 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 
13:50:00.144740 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="rsync" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144747 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="rsync" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144754 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerName="nova-cell1-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144762 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerName="nova-cell1-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144770 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144778 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144790 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144796 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-api" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144807 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144814 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api-log" Mar 20 13:50:00 crc 
kubenswrapper[4856]: E0320 13:50:00.144824 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144831 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144842 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144849 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144859 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="ovn-northd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144866 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="ovn-northd" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144877 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server-init" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144884 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server-init" Mar 20 13:50:00 crc kubenswrapper[4856]: E0320 13:50:00.144894 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="galera" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.144900 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="galera" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145083 4856 
memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145097 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="rsync" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145112 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ac0adc6-d09a-4367-838e-67f78ae5a050" containerName="keystone-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145123 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145133 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145145 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="proxy-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145156 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="ovn-northd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145163 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145173 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="467cf6ce-9c87-45d6-9968-4d5372f70cb3" containerName="nova-metadata-metadata" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145183 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4397f29e-c0c9-4726-8fb4-1afe1441ec83" containerName="galera" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 
13:50:00.145190 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145200 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145208 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70b9b91-b663-40a8-a2a8-f1f57fc17bab" containerName="placement-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145216 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="cinder-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145224 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="628ad6bb-ab51-4021-9757-4247a1ccfa71" containerName="memcached" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145233 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-central-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145242 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145252 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145258 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="82ca80b6-bf8a-4741-a5e0-059f20fae69b" containerName="kube-state-metrics" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145283 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee995e44-3c2c-4ca3-9945-b9b757269749" containerName="barbican-worker-log" Mar 20 
13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145290 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovs-vswitchd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145298 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145305 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f4bbca3-e3dd-4be1-bf5b-43f88956883b" containerName="neutron-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145313 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145318 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="ceilometer-notification-agent" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145327 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145333 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-expirer" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145340 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e5225f1-7607-4e11-904f-0e40e483d384" containerName="ovsdb-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145347 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145355 4856 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1f98c320-f318-443d-816d-f3dec9784023" containerName="barbican-keystone-listener" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145361 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2bd8e2-7f52-4c35-ac1d-f1175581a751" containerName="rabbitmq" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145368 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145375 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="swift-recon-cron" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145384 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145391 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145400 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-httpd" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145405 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cf0465b-c48d-4c35-8e65-3f82c517ad98" containerName="nova-api-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145414 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145421 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b314fa97-2e86-46ef-8034-97bb179a3139" containerName="nova-cell0-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145427 4856 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7657bdc0-a1e5-4421-aceb-8cd410fc0226" containerName="nova-cell1-conductor-conductor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145435 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8748e306-2876-434d-abef-f7d9cd7c7a07" containerName="probe" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145441 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="container-updater" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145448 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65360e6-90a7-4a8e-8647-6239e7c52e5b" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145455 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fc74c5-121e-4ac1-8d50-8be3393d080a" containerName="openstack-network-exporter" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145462 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="object-auditor" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145470 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a53fecc-3af1-4ced-acd9-198296d50771" containerName="cinder-api" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145479 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="9010796b-5362-4885-8a2c-19668efe6e25" containerName="glance-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145485 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="050eefc7-c113-4198-b1ad-0645ad765a2a" containerName="sg-core" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145515 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-replicator" Mar 20 13:50:00 crc kubenswrapper[4856]: 
I0320 13:50:00.145522 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5e6ebd4-fc42-4a07-9f72-572e9a48b4d3" containerName="nova-scheduler-scheduler" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145531 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-reaper" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145544 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0be3924-19c6-4eee-bc60-7fbe28336b67" containerName="barbican-api-log" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.145552 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="179c29eb-c606-4429-8bbd-f7a4f62790f9" containerName="account-server" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.146006 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.148094 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.148509 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.148671 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.150063 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-b9x2v"] Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.199329 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7w6\" (UniqueName: \"kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6\") pod 
\"auto-csr-approver-29566910-b9x2v\" (UID: \"60869252-c55a-4e0a-a06d-9d57b6539209\") " pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.301138 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7w6\" (UniqueName: \"kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6\") pod \"auto-csr-approver-29566910-b9x2v\" (UID: \"60869252-c55a-4e0a-a06d-9d57b6539209\") " pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.322763 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7w6\" (UniqueName: \"kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6\") pod \"auto-csr-approver-29566910-b9x2v\" (UID: \"60869252-c55a-4e0a-a06d-9d57b6539209\") " pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.471184 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:00 crc kubenswrapper[4856]: I0320 13:50:00.927669 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-b9x2v"] Mar 20 13:50:00 crc kubenswrapper[4856]: W0320 13:50:00.940600 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60869252_c55a_4e0a_a06d_9d57b6539209.slice/crio-524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa WatchSource:0}: Error finding container 524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa: Status 404 returned error can't find the container with id 524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa Mar 20 13:50:01 crc kubenswrapper[4856]: I0320 13:50:01.694935 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" event={"ID":"60869252-c55a-4e0a-a06d-9d57b6539209","Type":"ContainerStarted","Data":"524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa"} Mar 20 13:50:02 crc kubenswrapper[4856]: I0320 13:50:02.711931 4856 generic.go:334] "Generic (PLEG): container finished" podID="60869252-c55a-4e0a-a06d-9d57b6539209" containerID="a9476a66d4cd3f2685d44b05c73737f808a2bf651148a8eb37f7976db65af7fb" exitCode=0 Mar 20 13:50:02 crc kubenswrapper[4856]: I0320 13:50:02.712011 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" event={"ID":"60869252-c55a-4e0a-a06d-9d57b6539209","Type":"ContainerDied","Data":"a9476a66d4cd3f2685d44b05c73737f808a2bf651148a8eb37f7976db65af7fb"} Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.017618 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.053392 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5z7w6\" (UniqueName: \"kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6\") pod \"60869252-c55a-4e0a-a06d-9d57b6539209\" (UID: \"60869252-c55a-4e0a-a06d-9d57b6539209\") " Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.058627 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6" (OuterVolumeSpecName: "kube-api-access-5z7w6") pod "60869252-c55a-4e0a-a06d-9d57b6539209" (UID: "60869252-c55a-4e0a-a06d-9d57b6539209"). InnerVolumeSpecName "kube-api-access-5z7w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.155722 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5z7w6\" (UniqueName: \"kubernetes.io/projected/60869252-c55a-4e0a-a06d-9d57b6539209-kube-api-access-5z7w6\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.732905 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" event={"ID":"60869252-c55a-4e0a-a06d-9d57b6539209","Type":"ContainerDied","Data":"524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa"} Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.732949 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="524625da3f7c858cbe75c5e8eea6e815f0cf987e73cb2217f8a705a4083e97aa" Mar 20 13:50:04 crc kubenswrapper[4856]: I0320 13:50:04.733237 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-b9x2v" Mar 20 13:50:05 crc kubenswrapper[4856]: I0320 13:50:05.082993 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-nhn9w"] Mar 20 13:50:05 crc kubenswrapper[4856]: I0320 13:50:05.088202 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-nhn9w"] Mar 20 13:50:05 crc kubenswrapper[4856]: I0320 13:50:05.829006 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e26cfc8-a8ce-4db8-bc87-f57d1d209731" path="/var/lib/kubelet/pods/3e26cfc8-a8ce-4db8-bc87-f57d1d209731/volumes" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.067928 4856 scope.go:117] "RemoveContainer" containerID="1301da16b3fd93ee4281a8dd4293e0bcc97da5a9551305c02170570ec4f1d44a" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.096338 4856 scope.go:117] "RemoveContainer" containerID="d9659d1018001e10fbb34682f5c5367a558d71cd9ab058ad5de9f78c74b341b6" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.119748 4856 scope.go:117] "RemoveContainer" containerID="fdbae009f5ef9dcbbfa96f14f7c8dd2d9accee678a6a94ed47f4995dd338234c" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.142162 4856 scope.go:117] "RemoveContainer" containerID="e508e1652b92ae1e87caf70b07b8b6ca94d53d51788493cfc4280c00f1d42a1f" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.194665 4856 scope.go:117] "RemoveContainer" containerID="e44988e1ef016b282859d3b5848743fd242e2ba8424bb50bbbf315c770a62806" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.211689 4856 scope.go:117] "RemoveContainer" containerID="942d84a14a4be4dfa3b0b57021f39fb11626ccf0b4b7bb7010b8dc07df4835fb" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.233201 4856 scope.go:117] "RemoveContainer" containerID="e74e7181e50b8482827268fba5857dee7206b3d88d8299a92ee57e8febb4c398" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.252242 4856 
scope.go:117] "RemoveContainer" containerID="11061327d008cbbe148b32ba2127eb9d46fde32a3092f69a96fe49fa415abcd1" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.267822 4856 scope.go:117] "RemoveContainer" containerID="37fa27fb17bdccebcd671a7ee5c397835628b1892f2a54a0c7afe739f659be86" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.289964 4856 scope.go:117] "RemoveContainer" containerID="00a8dfecbc0a5d7ce0a19c67a349d5cccda83e9e9fee4ec3f2df1ac74770930a" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.328229 4856 scope.go:117] "RemoveContainer" containerID="e0baf76a23d10f6480a418e705166277839f259f203f9d0a350f3e3805d573a6" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.349208 4856 scope.go:117] "RemoveContainer" containerID="36fcff4fedd17a8db136e70258f4b74c385ee450d44580a92885487081aa1af7" Mar 20 13:50:31 crc kubenswrapper[4856]: I0320 13:50:31.375912 4856 scope.go:117] "RemoveContainer" containerID="c1855580ac0c21798b3e4dc54c1cbf54ac5da36e0b5f20f530bd19ccb88ac3c7" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.533096 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:50:48 crc kubenswrapper[4856]: E0320 13:50:48.533904 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60869252-c55a-4e0a-a06d-9d57b6539209" containerName="oc" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.533922 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="60869252-c55a-4e0a-a06d-9d57b6539209" containerName="oc" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.534084 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="60869252-c55a-4e0a-a06d-9d57b6539209" containerName="oc" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.535653 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.553896 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.684629 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.684670 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.684697 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8nsc\" (UniqueName: \"kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.785946 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.786006 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.786035 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8nsc\" (UniqueName: \"kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.786557 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.786649 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.807206 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8nsc\" (UniqueName: \"kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc\") pod \"redhat-marketplace-nm289\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:48 crc kubenswrapper[4856]: I0320 13:50:48.868477 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:49 crc kubenswrapper[4856]: I0320 13:50:49.303539 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:50:49 crc kubenswrapper[4856]: W0320 13:50:49.305978 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffcf6a02_a9cc_4d8d_9810_c3e715c5832a.slice/crio-07ad4547e2f638f0343afa11869a949330ebf2e7639d977bfe0b2f878ea99562 WatchSource:0}: Error finding container 07ad4547e2f638f0343afa11869a949330ebf2e7639d977bfe0b2f878ea99562: Status 404 returned error can't find the container with id 07ad4547e2f638f0343afa11869a949330ebf2e7639d977bfe0b2f878ea99562 Mar 20 13:50:50 crc kubenswrapper[4856]: I0320 13:50:50.193022 4856 generic.go:334] "Generic (PLEG): container finished" podID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerID="c148250a5f2098cc34b14b21c0636823363a5aa6d12d421fc004db9d58af155b" exitCode=0 Mar 20 13:50:50 crc kubenswrapper[4856]: I0320 13:50:50.193094 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerDied","Data":"c148250a5f2098cc34b14b21c0636823363a5aa6d12d421fc004db9d58af155b"} Mar 20 13:50:50 crc kubenswrapper[4856]: I0320 13:50:50.193693 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerStarted","Data":"07ad4547e2f638f0343afa11869a949330ebf2e7639d977bfe0b2f878ea99562"} Mar 20 13:50:52 crc kubenswrapper[4856]: I0320 13:50:52.210711 4856 generic.go:334] "Generic (PLEG): container finished" podID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerID="d8723c9a7203f9d622c68657847e4901d7a5413ec5d560de899a24ff9b107ce6" exitCode=0 Mar 20 13:50:52 crc kubenswrapper[4856]: I0320 
13:50:52.210833 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerDied","Data":"d8723c9a7203f9d622c68657847e4901d7a5413ec5d560de899a24ff9b107ce6"} Mar 20 13:50:53 crc kubenswrapper[4856]: I0320 13:50:53.219832 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerStarted","Data":"0bdb62d2cb434cdf910ced81dd41395a4bfa829b3dd534974cb029abd6f01210"} Mar 20 13:50:58 crc kubenswrapper[4856]: I0320 13:50:58.869636 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:58 crc kubenswrapper[4856]: I0320 13:50:58.870537 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:58 crc kubenswrapper[4856]: I0320 13:50:58.928020 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:50:58 crc kubenswrapper[4856]: I0320 13:50:58.960166 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nm289" podStartSLOduration=8.386393465 podStartE2EDuration="10.960147575s" podCreationTimestamp="2026-03-20 13:50:48 +0000 UTC" firstStartedPulling="2026-03-20 13:50:50.194728627 +0000 UTC m=+1665.075754757" lastFinishedPulling="2026-03-20 13:50:52.768482727 +0000 UTC m=+1667.649508867" observedRunningTime="2026-03-20 13:50:53.242625493 +0000 UTC m=+1668.123651633" watchObservedRunningTime="2026-03-20 13:50:58.960147575 +0000 UTC m=+1673.841173715" Mar 20 13:50:59 crc kubenswrapper[4856]: I0320 13:50:59.321323 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 
13:50:59 crc kubenswrapper[4856]: I0320 13:50:59.393161 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:51:01 crc kubenswrapper[4856]: I0320 13:51:01.294080 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nm289" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="registry-server" containerID="cri-o://0bdb62d2cb434cdf910ced81dd41395a4bfa829b3dd534974cb029abd6f01210" gracePeriod=2 Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.307035 4856 generic.go:334] "Generic (PLEG): container finished" podID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerID="0bdb62d2cb434cdf910ced81dd41395a4bfa829b3dd534974cb029abd6f01210" exitCode=0 Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.307072 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerDied","Data":"0bdb62d2cb434cdf910ced81dd41395a4bfa829b3dd534974cb029abd6f01210"} Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.681510 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.823830 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities\") pod \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.823925 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content\") pod \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.823966 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8nsc\" (UniqueName: \"kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc\") pod \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\" (UID: \"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a\") " Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.824975 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities" (OuterVolumeSpecName: "utilities") pod "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" (UID: "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.831480 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc" (OuterVolumeSpecName: "kube-api-access-b8nsc") pod "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" (UID: "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a"). InnerVolumeSpecName "kube-api-access-b8nsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.867291 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" (UID: "ffcf6a02-a9cc-4d8d-9810-c3e715c5832a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.926367 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.926425 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:02 crc kubenswrapper[4856]: I0320 13:51:02.926435 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8nsc\" (UniqueName: \"kubernetes.io/projected/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a-kube-api-access-b8nsc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.321770 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nm289" event={"ID":"ffcf6a02-a9cc-4d8d-9810-c3e715c5832a","Type":"ContainerDied","Data":"07ad4547e2f638f0343afa11869a949330ebf2e7639d977bfe0b2f878ea99562"} Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.321844 4856 scope.go:117] "RemoveContainer" containerID="0bdb62d2cb434cdf910ced81dd41395a4bfa829b3dd534974cb029abd6f01210" Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.321891 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nm289" Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.356444 4856 scope.go:117] "RemoveContainer" containerID="d8723c9a7203f9d622c68657847e4901d7a5413ec5d560de899a24ff9b107ce6" Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.389865 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.398348 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nm289"] Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.404952 4856 scope.go:117] "RemoveContainer" containerID="c148250a5f2098cc34b14b21c0636823363a5aa6d12d421fc004db9d58af155b" Mar 20 13:51:03 crc kubenswrapper[4856]: I0320 13:51:03.841681 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" path="/var/lib/kubelet/pods/ffcf6a02-a9cc-4d8d-9810-c3e715c5832a/volumes" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.580923 4856 scope.go:117] "RemoveContainer" containerID="064a0bd3d163eca864105c246a50becc9a7b5a97a2cb898d52c76240f0bad351" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.621150 4856 scope.go:117] "RemoveContainer" containerID="f708b70995f625b5570b66b4c5d623f049c5b186f6b7a641284f83c29309e266" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.668548 4856 scope.go:117] "RemoveContainer" containerID="4fe885cdd4aa1c3051a53c27c91141ee996859bc09dc6cddf5ba8e626d811008" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.707041 4856 scope.go:117] "RemoveContainer" containerID="84d846a4b23272139a017e7b80a305138eb8dcb2d4719bcd2bb1f9b05fa1a53d" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.763027 4856 scope.go:117] "RemoveContainer" containerID="5d5613f916e03b4314b4133cf18da4fd036e61c334a7e2755e150637c92b7cd9" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.820891 4856 
scope.go:117] "RemoveContainer" containerID="f6981b172eccdb9553125740236321d68c8c0731d63afc46cf2040e981ea85b8" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.884351 4856 scope.go:117] "RemoveContainer" containerID="2a042cc4e28d2645cd41e9cac3dae6d1aedb80c69cbc119c7f670a0e6081ff5a" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.908201 4856 scope.go:117] "RemoveContainer" containerID="c37597c183ad2be22cd4505258c442e053d875efb09f132479a632f17125202b" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.935308 4856 scope.go:117] "RemoveContainer" containerID="929bd48ca16b4b45e1ae843fe0a9e263670cc6925e68d991b3b7e682231f8a91" Mar 20 13:51:31 crc kubenswrapper[4856]: I0320 13:51:31.962406 4856 scope.go:117] "RemoveContainer" containerID="a366776a203a7fa9d5f08eb4671d855fe2eeb0585c1540357869f4722b8099e0" Mar 20 13:51:32 crc kubenswrapper[4856]: I0320 13:51:32.002389 4856 scope.go:117] "RemoveContainer" containerID="02776c7a86559edcb0633594c636cd3cd3d9717439545e3e679b74290bdecf55" Mar 20 13:51:32 crc kubenswrapper[4856]: I0320 13:51:32.047122 4856 scope.go:117] "RemoveContainer" containerID="70a9aeecd1a7c315226a9a8375517e93271afd2ec56f559f08f46435a6345ac8" Mar 20 13:51:32 crc kubenswrapper[4856]: I0320 13:51:32.073461 4856 scope.go:117] "RemoveContainer" containerID="e665ed6ce24fbf7ef1d1d05407ed507322019dcf1a1f8c42525de5d8898eb0e4" Mar 20 13:51:39 crc kubenswrapper[4856]: I0320 13:51:39.987380 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:51:39 crc kubenswrapper[4856]: I0320 13:51:39.987686 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.150680 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-99ctf"] Mar 20 13:52:00 crc kubenswrapper[4856]: E0320 13:52:00.153305 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="registry-server" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.153339 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="registry-server" Mar 20 13:52:00 crc kubenswrapper[4856]: E0320 13:52:00.153363 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="extract-content" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.153371 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="extract-content" Mar 20 13:52:00 crc kubenswrapper[4856]: E0320 13:52:00.153381 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="extract-utilities" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.153391 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="extract-utilities" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.153582 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffcf6a02-a9cc-4d8d-9810-c3e715c5832a" containerName="registry-server" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.154205 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.155959 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.156220 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.156563 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.164730 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-99ctf"] Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.306204 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d22gn\" (UniqueName: \"kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn\") pod \"auto-csr-approver-29566912-99ctf\" (UID: \"ef24db81-268d-49ad-92b6-79c79398fb22\") " pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.407444 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d22gn\" (UniqueName: \"kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn\") pod \"auto-csr-approver-29566912-99ctf\" (UID: \"ef24db81-268d-49ad-92b6-79c79398fb22\") " pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.431492 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d22gn\" (UniqueName: \"kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn\") pod \"auto-csr-approver-29566912-99ctf\" (UID: \"ef24db81-268d-49ad-92b6-79c79398fb22\") " 
pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.475862 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.711482 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-99ctf"] Mar 20 13:52:00 crc kubenswrapper[4856]: W0320 13:52:00.718504 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef24db81_268d_49ad_92b6_79c79398fb22.slice/crio-f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39 WatchSource:0}: Error finding container f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39: Status 404 returned error can't find the container with id f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39 Mar 20 13:52:00 crc kubenswrapper[4856]: I0320 13:52:00.872622 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-99ctf" event={"ID":"ef24db81-268d-49ad-92b6-79c79398fb22","Type":"ContainerStarted","Data":"f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39"} Mar 20 13:52:01 crc kubenswrapper[4856]: I0320 13:52:01.880538 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-99ctf" event={"ID":"ef24db81-268d-49ad-92b6-79c79398fb22","Type":"ContainerStarted","Data":"0446fc63578af9c9d70788f4526df6ed94d8ad6ca8b4cb119178c67860206c97"} Mar 20 13:52:01 crc kubenswrapper[4856]: I0320 13:52:01.899395 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566912-99ctf" podStartSLOduration=1.043202364 podStartE2EDuration="1.899381078s" podCreationTimestamp="2026-03-20 13:52:00 +0000 UTC" firstStartedPulling="2026-03-20 13:52:00.720984906 +0000 UTC 
m=+1735.602011056" lastFinishedPulling="2026-03-20 13:52:01.5771636 +0000 UTC m=+1736.458189770" observedRunningTime="2026-03-20 13:52:01.895382598 +0000 UTC m=+1736.776408728" watchObservedRunningTime="2026-03-20 13:52:01.899381078 +0000 UTC m=+1736.780407208" Mar 20 13:52:02 crc kubenswrapper[4856]: I0320 13:52:02.891300 4856 generic.go:334] "Generic (PLEG): container finished" podID="ef24db81-268d-49ad-92b6-79c79398fb22" containerID="0446fc63578af9c9d70788f4526df6ed94d8ad6ca8b4cb119178c67860206c97" exitCode=0 Mar 20 13:52:02 crc kubenswrapper[4856]: I0320 13:52:02.892570 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-99ctf" event={"ID":"ef24db81-268d-49ad-92b6-79c79398fb22","Type":"ContainerDied","Data":"0446fc63578af9c9d70788f4526df6ed94d8ad6ca8b4cb119178c67860206c97"} Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.216382 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.363905 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d22gn\" (UniqueName: \"kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn\") pod \"ef24db81-268d-49ad-92b6-79c79398fb22\" (UID: \"ef24db81-268d-49ad-92b6-79c79398fb22\") " Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.372426 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn" (OuterVolumeSpecName: "kube-api-access-d22gn") pod "ef24db81-268d-49ad-92b6-79c79398fb22" (UID: "ef24db81-268d-49ad-92b6-79c79398fb22"). InnerVolumeSpecName "kube-api-access-d22gn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.467345 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d22gn\" (UniqueName: \"kubernetes.io/projected/ef24db81-268d-49ad-92b6-79c79398fb22-kube-api-access-d22gn\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.910244 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-99ctf" event={"ID":"ef24db81-268d-49ad-92b6-79c79398fb22","Type":"ContainerDied","Data":"f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39"} Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.910363 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4ddeabdcee864b96b95d5c6e7c4dde7eada2ab3c6fec0304aabed1e41598f39" Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.910400 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-99ctf" Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.978607 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-llv2r"] Mar 20 13:52:04 crc kubenswrapper[4856]: I0320 13:52:04.983869 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-llv2r"] Mar 20 13:52:05 crc kubenswrapper[4856]: I0320 13:52:05.838011 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c376281a-ae9c-4057-a9ac-1ef731747830" path="/var/lib/kubelet/pods/c376281a-ae9c-4057-a9ac-1ef731747830/volumes" Mar 20 13:52:09 crc kubenswrapper[4856]: I0320 13:52:09.988046 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:52:09 crc kubenswrapper[4856]: I0320 13:52:09.988664 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:32 crc kubenswrapper[4856]: I0320 13:52:32.310847 4856 scope.go:117] "RemoveContainer" containerID="749708f3aae905141646bbe16b71edc1c0c7123765a07503a72806e70934b3c9" Mar 20 13:52:32 crc kubenswrapper[4856]: I0320 13:52:32.365184 4856 scope.go:117] "RemoveContainer" containerID="727b99f476c9dba3f82144f27425c67b34ca4340ff2eb5563dea3b3ca28872a6" Mar 20 13:52:32 crc kubenswrapper[4856]: I0320 13:52:32.397890 4856 scope.go:117] "RemoveContainer" containerID="ac2cf0183696c7efdb1e246690f9b18fbccf038a604105e4af9135030cb8a10c" Mar 20 13:52:39 crc kubenswrapper[4856]: I0320 13:52:39.987840 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:52:39 crc kubenswrapper[4856]: I0320 13:52:39.988097 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:39 crc kubenswrapper[4856]: I0320 13:52:39.988146 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 13:52:39 crc kubenswrapper[4856]: I0320 13:52:39.988789 4856 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:52:39 crc kubenswrapper[4856]: I0320 13:52:39.988836 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" gracePeriod=600 Mar 20 13:52:40 crc kubenswrapper[4856]: E0320 13:52:40.125448 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:52:40 crc kubenswrapper[4856]: I0320 13:52:40.263209 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" exitCode=0 Mar 20 13:52:40 crc kubenswrapper[4856]: I0320 13:52:40.263249 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e"} Mar 20 13:52:40 crc kubenswrapper[4856]: I0320 13:52:40.263297 4856 scope.go:117] "RemoveContainer" 
containerID="8ff0188671a4ceb2d7b4a21321cd19c62708c14b4473d8f33ecd5827e498b585" Mar 20 13:52:40 crc kubenswrapper[4856]: I0320 13:52:40.264258 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:52:40 crc kubenswrapper[4856]: E0320 13:52:40.264530 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:52:52 crc kubenswrapper[4856]: I0320 13:52:52.820226 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:52:52 crc kubenswrapper[4856]: E0320 13:52:52.821207 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:53:04 crc kubenswrapper[4856]: I0320 13:53:04.820202 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:53:04 crc kubenswrapper[4856]: E0320 13:53:04.821109 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:53:16 crc kubenswrapper[4856]: I0320 13:53:16.820137 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:53:16 crc kubenswrapper[4856]: E0320 13:53:16.821310 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:53:31 crc kubenswrapper[4856]: I0320 13:53:31.820698 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:53:31 crc kubenswrapper[4856]: E0320 13:53:31.821587 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.476157 4856 scope.go:117] "RemoveContainer" containerID="47598b985bcb2355ceb6386186e9855bab5c284b8b5f4f8be5e0ef8201c62f0a" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.514251 4856 scope.go:117] "RemoveContainer" containerID="ea3e57786dd5d593272c7769d78753522ecbf8651c0d96f5c6236f1c41bd02ec" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.561109 4856 scope.go:117] "RemoveContainer" containerID="c911729d76155581835fd96d5ba0fd73a56408242fdd2ebb71949a54c4f368f8" Mar 
20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.595444 4856 scope.go:117] "RemoveContainer" containerID="7c030a52fa49c1ad27d857f75abf652057f78f1ef66883ba32b85866d7b6baf7" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.639657 4856 scope.go:117] "RemoveContainer" containerID="77e3580170a69870b08acfefe46b48cd21070ff07cffab50cec1275d006beea4" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.675583 4856 scope.go:117] "RemoveContainer" containerID="44ee3199e8ef2be02919b100ea3ac3eb65af208ff9499ecb07ccb5204ef2e122" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.703195 4856 scope.go:117] "RemoveContainer" containerID="65ed668c045bd2fb920f6ef6b0e53b5aa27ea56d88a168862d0d47b73502ce27" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.720762 4856 scope.go:117] "RemoveContainer" containerID="f4db134aa1fb1676999455edc540061e8b9ce9c16ee9acfc003c7b44c61b255c" Mar 20 13:53:32 crc kubenswrapper[4856]: I0320 13:53:32.742561 4856 scope.go:117] "RemoveContainer" containerID="19e3e1f0478c0cdbf16f45c422cf145e176facf22ca8b861bee509403ecaf61f" Mar 20 13:53:44 crc kubenswrapper[4856]: I0320 13:53:44.819930 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:53:44 crc kubenswrapper[4856]: E0320 13:53:44.821023 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:53:58 crc kubenswrapper[4856]: I0320 13:53:58.820852 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:53:58 crc kubenswrapper[4856]: E0320 13:53:58.821909 4856 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.182939 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-6z2wg"] Mar 20 13:54:00 crc kubenswrapper[4856]: E0320 13:54:00.184364 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef24db81-268d-49ad-92b6-79c79398fb22" containerName="oc" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.184503 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef24db81-268d-49ad-92b6-79c79398fb22" containerName="oc" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.184846 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef24db81-268d-49ad-92b6-79c79398fb22" containerName="oc" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.185696 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.188735 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.188983 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.189132 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.204188 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-6z2wg"] Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.304976 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts5hr\" (UniqueName: \"kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr\") pod \"auto-csr-approver-29566914-6z2wg\" (UID: \"b5c2fe13-4636-47e7-a767-5756313be1e2\") " pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.406812 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts5hr\" (UniqueName: \"kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr\") pod \"auto-csr-approver-29566914-6z2wg\" (UID: \"b5c2fe13-4636-47e7-a767-5756313be1e2\") " pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.428025 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts5hr\" (UniqueName: \"kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr\") pod \"auto-csr-approver-29566914-6z2wg\" (UID: \"b5c2fe13-4636-47e7-a767-5756313be1e2\") " 
pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.520037 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.795440 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-6z2wg"] Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.803756 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:54:00 crc kubenswrapper[4856]: I0320 13:54:00.974745 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" event={"ID":"b5c2fe13-4636-47e7-a767-5756313be1e2","Type":"ContainerStarted","Data":"f7b5d86ae55eb9da3cf9f71f60b39945a728d1a2d0cf70c8ee184df0e0c22e0d"} Mar 20 13:54:03 crc kubenswrapper[4856]: I0320 13:54:02.999654 4856 generic.go:334] "Generic (PLEG): container finished" podID="b5c2fe13-4636-47e7-a767-5756313be1e2" containerID="d8e121aeb28378339b265763f110f9179a4a213930ac1e352d0fbe0ec10ab645" exitCode=0 Mar 20 13:54:03 crc kubenswrapper[4856]: I0320 13:54:02.999792 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" event={"ID":"b5c2fe13-4636-47e7-a767-5756313be1e2","Type":"ContainerDied","Data":"d8e121aeb28378339b265763f110f9179a4a213930ac1e352d0fbe0ec10ab645"} Mar 20 13:54:04 crc kubenswrapper[4856]: I0320 13:54:04.358053 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:04 crc kubenswrapper[4856]: I0320 13:54:04.463841 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts5hr\" (UniqueName: \"kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr\") pod \"b5c2fe13-4636-47e7-a767-5756313be1e2\" (UID: \"b5c2fe13-4636-47e7-a767-5756313be1e2\") " Mar 20 13:54:04 crc kubenswrapper[4856]: I0320 13:54:04.470094 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr" (OuterVolumeSpecName: "kube-api-access-ts5hr") pod "b5c2fe13-4636-47e7-a767-5756313be1e2" (UID: "b5c2fe13-4636-47e7-a767-5756313be1e2"). InnerVolumeSpecName "kube-api-access-ts5hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:04 crc kubenswrapper[4856]: I0320 13:54:04.565717 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts5hr\" (UniqueName: \"kubernetes.io/projected/b5c2fe13-4636-47e7-a767-5756313be1e2-kube-api-access-ts5hr\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.018953 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" event={"ID":"b5c2fe13-4636-47e7-a767-5756313be1e2","Type":"ContainerDied","Data":"f7b5d86ae55eb9da3cf9f71f60b39945a728d1a2d0cf70c8ee184df0e0c22e0d"} Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.019014 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7b5d86ae55eb9da3cf9f71f60b39945a728d1a2d0cf70c8ee184df0e0c22e0d" Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.019019 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-6z2wg" Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.449253 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dfsv2"] Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.458665 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dfsv2"] Mar 20 13:54:05 crc kubenswrapper[4856]: I0320 13:54:05.830509 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="facc5117-d669-458c-b9fc-33d0e67c4610" path="/var/lib/kubelet/pods/facc5117-d669-458c-b9fc-33d0e67c4610/volumes" Mar 20 13:54:10 crc kubenswrapper[4856]: I0320 13:54:10.819668 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:54:10 crc kubenswrapper[4856]: E0320 13:54:10.820301 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:54:24 crc kubenswrapper[4856]: I0320 13:54:24.820716 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:54:24 crc kubenswrapper[4856]: E0320 13:54:24.821868 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:54:32 crc kubenswrapper[4856]: I0320 13:54:32.850950 4856 scope.go:117] "RemoveContainer" containerID="8d752ab76f5a06d13b2f52ba3237d5019795b8648083187e81109c9309f5766b" Mar 20 13:54:32 crc kubenswrapper[4856]: I0320 13:54:32.911856 4856 scope.go:117] "RemoveContainer" containerID="1927ae53faa128939cfcd70a23a0ae912818272307357559a80053d78b78317f" Mar 20 13:54:32 crc kubenswrapper[4856]: I0320 13:54:32.949773 4856 scope.go:117] "RemoveContainer" containerID="dfcea8e8e09281bcfc96738b570f941e3374939aa32a47165852731b6ed71dea" Mar 20 13:54:32 crc kubenswrapper[4856]: I0320 13:54:32.972250 4856 scope.go:117] "RemoveContainer" containerID="d27f4b299032daa64a1c2fbb0a6497ce48fae2970eb08d16566671f428d9e3ea" Mar 20 13:54:36 crc kubenswrapper[4856]: I0320 13:54:36.820521 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:54:36 crc kubenswrapper[4856]: E0320 13:54:36.821595 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.197242 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:42 crc kubenswrapper[4856]: E0320 13:54:42.198405 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5c2fe13-4636-47e7-a767-5756313be1e2" containerName="oc" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.198427 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c2fe13-4636-47e7-a767-5756313be1e2" containerName="oc" Mar 20 13:54:42 crc 
kubenswrapper[4856]: I0320 13:54:42.198665 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5c2fe13-4636-47e7-a767-5756313be1e2" containerName="oc" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.200160 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.219207 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.251621 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x54fv\" (UniqueName: \"kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.251904 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.251982 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.353875 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x54fv\" (UniqueName: 
\"kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.353963 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.353985 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.354639 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.354740 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.383742 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x54fv\" (UniqueName: 
\"kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv\") pod \"community-operators-wvf4f\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:42 crc kubenswrapper[4856]: I0320 13:54:42.531222 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:43 crc kubenswrapper[4856]: I0320 13:54:43.055416 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:43 crc kubenswrapper[4856]: I0320 13:54:43.442202 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerID="9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0" exitCode=0 Mar 20 13:54:43 crc kubenswrapper[4856]: I0320 13:54:43.442263 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerDied","Data":"9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0"} Mar 20 13:54:43 crc kubenswrapper[4856]: I0320 13:54:43.442762 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerStarted","Data":"0143ae9beb2cfbc0602c44f2a99601adfaad990d57bbb05dc0435fbb2ff3ff2a"} Mar 20 13:54:45 crc kubenswrapper[4856]: I0320 13:54:45.462854 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerStarted","Data":"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87"} Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.007106 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 
13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.008983 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.018729 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.125244 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhlx\" (UniqueName: \"kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.125328 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.125621 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.227527 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 
13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.227654 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhlx\" (UniqueName: \"kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.227687 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.228258 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.228571 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.273185 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhlx\" (UniqueName: \"kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx\") pod \"certified-operators-95swb\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc 
kubenswrapper[4856]: I0320 13:54:46.340849 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.475625 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerID="550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87" exitCode=0 Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.475672 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerDied","Data":"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87"} Mar 20 13:54:46 crc kubenswrapper[4856]: W0320 13:54:46.846334 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd79a9781_855d_4a2a_a820_bcf0108853ec.slice/crio-255a237ffe71bf18a243228cc1636dd75817b44719346d4bdad0572fc95dc992 WatchSource:0}: Error finding container 255a237ffe71bf18a243228cc1636dd75817b44719346d4bdad0572fc95dc992: Status 404 returned error can't find the container with id 255a237ffe71bf18a243228cc1636dd75817b44719346d4bdad0572fc95dc992 Mar 20 13:54:46 crc kubenswrapper[4856]: I0320 13:54:46.847920 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 13:54:47 crc kubenswrapper[4856]: I0320 13:54:47.487524 4856 generic.go:334] "Generic (PLEG): container finished" podID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerID="9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120" exitCode=0 Mar 20 13:54:47 crc kubenswrapper[4856]: I0320 13:54:47.487662 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" 
event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerDied","Data":"9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120"} Mar 20 13:54:47 crc kubenswrapper[4856]: I0320 13:54:47.487962 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerStarted","Data":"255a237ffe71bf18a243228cc1636dd75817b44719346d4bdad0572fc95dc992"} Mar 20 13:54:47 crc kubenswrapper[4856]: I0320 13:54:47.492701 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerStarted","Data":"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da"} Mar 20 13:54:47 crc kubenswrapper[4856]: I0320 13:54:47.528203 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wvf4f" podStartSLOduration=1.894779869 podStartE2EDuration="5.52818307s" podCreationTimestamp="2026-03-20 13:54:42 +0000 UTC" firstStartedPulling="2026-03-20 13:54:43.446085815 +0000 UTC m=+1898.327111985" lastFinishedPulling="2026-03-20 13:54:47.079489056 +0000 UTC m=+1901.960515186" observedRunningTime="2026-03-20 13:54:47.524311984 +0000 UTC m=+1902.405338124" watchObservedRunningTime="2026-03-20 13:54:47.52818307 +0000 UTC m=+1902.409209200" Mar 20 13:54:49 crc kubenswrapper[4856]: I0320 13:54:49.516014 4856 generic.go:334] "Generic (PLEG): container finished" podID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerID="15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93" exitCode=0 Mar 20 13:54:49 crc kubenswrapper[4856]: I0320 13:54:49.516116 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" 
event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerDied","Data":"15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93"} Mar 20 13:54:50 crc kubenswrapper[4856]: I0320 13:54:50.529043 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerStarted","Data":"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9"} Mar 20 13:54:50 crc kubenswrapper[4856]: I0320 13:54:50.557794 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-95swb" podStartSLOduration=3.088374204 podStartE2EDuration="5.557773607s" podCreationTimestamp="2026-03-20 13:54:45 +0000 UTC" firstStartedPulling="2026-03-20 13:54:47.48940621 +0000 UTC m=+1902.370432350" lastFinishedPulling="2026-03-20 13:54:49.958805603 +0000 UTC m=+1904.839831753" observedRunningTime="2026-03-20 13:54:50.552618325 +0000 UTC m=+1905.433644495" watchObservedRunningTime="2026-03-20 13:54:50.557773607 +0000 UTC m=+1905.438799747" Mar 20 13:54:51 crc kubenswrapper[4856]: I0320 13:54:51.819824 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:54:51 crc kubenswrapper[4856]: E0320 13:54:51.820215 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:54:52 crc kubenswrapper[4856]: I0320 13:54:52.532534 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:52 crc 
kubenswrapper[4856]: I0320 13:54:52.532651 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:52 crc kubenswrapper[4856]: I0320 13:54:52.582935 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:52 crc kubenswrapper[4856]: I0320 13:54:52.641319 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:53 crc kubenswrapper[4856]: I0320 13:54:53.586093 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:54 crc kubenswrapper[4856]: I0320 13:54:54.565836 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wvf4f" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="registry-server" containerID="cri-o://6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da" gracePeriod=2 Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.064906 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.165299 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x54fv\" (UniqueName: \"kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv\") pod \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.165421 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities\") pod \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.165466 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content\") pod \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\" (UID: \"e2999df9-4345-47b2-8d0a-0ee91bfeefac\") " Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.166582 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities" (OuterVolumeSpecName: "utilities") pod "e2999df9-4345-47b2-8d0a-0ee91bfeefac" (UID: "e2999df9-4345-47b2-8d0a-0ee91bfeefac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.173829 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv" (OuterVolumeSpecName: "kube-api-access-x54fv") pod "e2999df9-4345-47b2-8d0a-0ee91bfeefac" (UID: "e2999df9-4345-47b2-8d0a-0ee91bfeefac"). InnerVolumeSpecName "kube-api-access-x54fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.259350 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2999df9-4345-47b2-8d0a-0ee91bfeefac" (UID: "e2999df9-4345-47b2-8d0a-0ee91bfeefac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.267106 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.267163 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2999df9-4345-47b2-8d0a-0ee91bfeefac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.267189 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x54fv\" (UniqueName: \"kubernetes.io/projected/e2999df9-4345-47b2-8d0a-0ee91bfeefac-kube-api-access-x54fv\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.579887 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerID="6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da" exitCode=0 Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.579934 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerDied","Data":"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da"} Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.579965 4856 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-wvf4f" event={"ID":"e2999df9-4345-47b2-8d0a-0ee91bfeefac","Type":"ContainerDied","Data":"0143ae9beb2cfbc0602c44f2a99601adfaad990d57bbb05dc0435fbb2ff3ff2a"} Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.579970 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wvf4f" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.579988 4856 scope.go:117] "RemoveContainer" containerID="6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.620853 4856 scope.go:117] "RemoveContainer" containerID="550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.639618 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.650084 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wvf4f"] Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.658787 4856 scope.go:117] "RemoveContainer" containerID="9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.696543 4856 scope.go:117] "RemoveContainer" containerID="6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da" Mar 20 13:54:55 crc kubenswrapper[4856]: E0320 13:54:55.696984 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da\": container with ID starting with 6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da not found: ID does not exist" containerID="6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 
13:54:55.697027 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da"} err="failed to get container status \"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da\": rpc error: code = NotFound desc = could not find container \"6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da\": container with ID starting with 6e8e16d703edaa6baace756cd0e67dfadbc8acd8f5fc8751e9c14d1871e349da not found: ID does not exist" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.697054 4856 scope.go:117] "RemoveContainer" containerID="550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87" Mar 20 13:54:55 crc kubenswrapper[4856]: E0320 13:54:55.697575 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87\": container with ID starting with 550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87 not found: ID does not exist" containerID="550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.697627 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87"} err="failed to get container status \"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87\": rpc error: code = NotFound desc = could not find container \"550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87\": container with ID starting with 550437e3758cd2d67354aeabf63c7be590204980cf0b8914114615ba923d9e87 not found: ID does not exist" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.697658 4856 scope.go:117] "RemoveContainer" containerID="9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0" Mar 20 13:54:55 crc 
kubenswrapper[4856]: E0320 13:54:55.698051 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0\": container with ID starting with 9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0 not found: ID does not exist" containerID="9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.698119 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0"} err="failed to get container status \"9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0\": rpc error: code = NotFound desc = could not find container \"9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0\": container with ID starting with 9bfcb891fd49e901392b03db06d08972b35a92f00d425f1c5f607bfb2d5373f0 not found: ID does not exist" Mar 20 13:54:55 crc kubenswrapper[4856]: I0320 13:54:55.835637 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" path="/var/lib/kubelet/pods/e2999df9-4345-47b2-8d0a-0ee91bfeefac/volumes" Mar 20 13:54:56 crc kubenswrapper[4856]: I0320 13:54:56.342095 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:56 crc kubenswrapper[4856]: I0320 13:54:56.342162 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:56 crc kubenswrapper[4856]: I0320 13:54:56.429689 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:56 crc kubenswrapper[4856]: I0320 13:54:56.648427 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:57 crc kubenswrapper[4856]: I0320 13:54:57.991118 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 13:54:58 crc kubenswrapper[4856]: I0320 13:54:58.612976 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-95swb" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="registry-server" containerID="cri-o://4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9" gracePeriod=2 Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.051923 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.127700 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities\") pod \"d79a9781-855d-4a2a-a820-bcf0108853ec\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.127889 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content\") pod \"d79a9781-855d-4a2a-a820-bcf0108853ec\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.128003 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhlx\" (UniqueName: \"kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx\") pod \"d79a9781-855d-4a2a-a820-bcf0108853ec\" (UID: \"d79a9781-855d-4a2a-a820-bcf0108853ec\") " Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.134681 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx" (OuterVolumeSpecName: "kube-api-access-rvhlx") pod "d79a9781-855d-4a2a-a820-bcf0108853ec" (UID: "d79a9781-855d-4a2a-a820-bcf0108853ec"). InnerVolumeSpecName "kube-api-access-rvhlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.137021 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities" (OuterVolumeSpecName: "utilities") pod "d79a9781-855d-4a2a-a820-bcf0108853ec" (UID: "d79a9781-855d-4a2a-a820-bcf0108853ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.205660 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d79a9781-855d-4a2a-a820-bcf0108853ec" (UID: "d79a9781-855d-4a2a-a820-bcf0108853ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.229927 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.229964 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhlx\" (UniqueName: \"kubernetes.io/projected/d79a9781-855d-4a2a-a820-bcf0108853ec-kube-api-access-rvhlx\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.229980 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d79a9781-855d-4a2a-a820-bcf0108853ec-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.624061 4856 generic.go:334] "Generic (PLEG): container finished" podID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerID="4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9" exitCode=0 Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.624102 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerDied","Data":"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9"} Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.624113 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-95swb" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.624129 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-95swb" event={"ID":"d79a9781-855d-4a2a-a820-bcf0108853ec","Type":"ContainerDied","Data":"255a237ffe71bf18a243228cc1636dd75817b44719346d4bdad0572fc95dc992"} Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.624147 4856 scope.go:117] "RemoveContainer" containerID="4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.666606 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.671391 4856 scope.go:117] "RemoveContainer" containerID="15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.676576 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-95swb"] Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.703673 4856 scope.go:117] "RemoveContainer" containerID="9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.741350 4856 scope.go:117] "RemoveContainer" containerID="4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9" Mar 20 13:54:59 crc kubenswrapper[4856]: E0320 13:54:59.741806 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9\": container with ID starting with 4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9 not found: ID does not exist" containerID="4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.741851 4856 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9"} err="failed to get container status \"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9\": rpc error: code = NotFound desc = could not find container \"4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9\": container with ID starting with 4a2624b733733e0bcafe3aa82945af83c1a997c7c4e1e74a385fcb8311711db9 not found: ID does not exist" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.741877 4856 scope.go:117] "RemoveContainer" containerID="15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93" Mar 20 13:54:59 crc kubenswrapper[4856]: E0320 13:54:59.742089 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93\": container with ID starting with 15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93 not found: ID does not exist" containerID="15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.742113 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93"} err="failed to get container status \"15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93\": rpc error: code = NotFound desc = could not find container \"15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93\": container with ID starting with 15327dce49dd704b2d27c2a1fd848d578fe153884b14db22c4172b8bd6a19f93 not found: ID does not exist" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.742126 4856 scope.go:117] "RemoveContainer" containerID="9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120" Mar 20 13:54:59 crc kubenswrapper[4856]: E0320 
13:54:59.742326 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120\": container with ID starting with 9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120 not found: ID does not exist" containerID="9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.742345 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120"} err="failed to get container status \"9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120\": rpc error: code = NotFound desc = could not find container \"9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120\": container with ID starting with 9ec5a774daf5a1addc645a3a1295960b847649cdbb0048d153c30baf22afe120 not found: ID does not exist" Mar 20 13:54:59 crc kubenswrapper[4856]: I0320 13:54:59.833095 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" path="/var/lib/kubelet/pods/d79a9781-855d-4a2a-a820-bcf0108853ec/volumes" Mar 20 13:55:06 crc kubenswrapper[4856]: I0320 13:55:06.819524 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:55:06 crc kubenswrapper[4856]: E0320 13:55:06.820105 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:55:19 crc kubenswrapper[4856]: I0320 13:55:19.820177 
4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:55:19 crc kubenswrapper[4856]: E0320 13:55:19.821461 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:55:32 crc kubenswrapper[4856]: I0320 13:55:32.819881 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:55:32 crc kubenswrapper[4856]: E0320 13:55:32.820501 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:55:43 crc kubenswrapper[4856]: I0320 13:55:43.820105 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:55:43 crc kubenswrapper[4856]: E0320 13:55:43.821225 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:55:54 crc kubenswrapper[4856]: I0320 
13:55:54.819690 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:55:54 crc kubenswrapper[4856]: E0320 13:55:54.820113 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.146944 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-zv78r"] Mar 20 13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147752 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147765 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147777 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147784 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147794 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="extract-content" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147800 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="extract-content" Mar 20 
13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147810 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147815 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="extract-utilities" Mar 20 13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147830 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147836 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: E0320 13:56:00.147853 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147858 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147982 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d79a9781-855d-4a2a-a820-bcf0108853ec" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.147995 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2999df9-4345-47b2-8d0a-0ee91bfeefac" containerName="registry-server" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.148903 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.151745 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.151747 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.152365 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.161615 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-zv78r"] Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.339266 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbdgz\" (UniqueName: \"kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz\") pod \"auto-csr-approver-29566916-zv78r\" (UID: \"e5f018f4-64b2-4211-aad2-5f3f0865d3c1\") " pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.440902 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbdgz\" (UniqueName: \"kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz\") pod \"auto-csr-approver-29566916-zv78r\" (UID: \"e5f018f4-64b2-4211-aad2-5f3f0865d3c1\") " pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.476865 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbdgz\" (UniqueName: \"kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz\") pod \"auto-csr-approver-29566916-zv78r\" (UID: \"e5f018f4-64b2-4211-aad2-5f3f0865d3c1\") " 
pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:00 crc kubenswrapper[4856]: I0320 13:56:00.770863 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:01 crc kubenswrapper[4856]: I0320 13:56:01.225912 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-zv78r"] Mar 20 13:56:02 crc kubenswrapper[4856]: I0320 13:56:02.164895 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-zv78r" event={"ID":"e5f018f4-64b2-4211-aad2-5f3f0865d3c1","Type":"ContainerStarted","Data":"c04d7f0a5847cc806f12e475b2c3032bf0375f943646ce72c34f095c6e1c7c7f"} Mar 20 13:56:03 crc kubenswrapper[4856]: I0320 13:56:03.178192 4856 generic.go:334] "Generic (PLEG): container finished" podID="e5f018f4-64b2-4211-aad2-5f3f0865d3c1" containerID="603821a4df46497e0fb06e6421765c404890f6e0e5e063fea08d49df33ce7688" exitCode=0 Mar 20 13:56:03 crc kubenswrapper[4856]: I0320 13:56:03.178318 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-zv78r" event={"ID":"e5f018f4-64b2-4211-aad2-5f3f0865d3c1","Type":"ContainerDied","Data":"603821a4df46497e0fb06e6421765c404890f6e0e5e063fea08d49df33ce7688"} Mar 20 13:56:04 crc kubenswrapper[4856]: I0320 13:56:04.570205 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:04 crc kubenswrapper[4856]: I0320 13:56:04.706712 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbdgz\" (UniqueName: \"kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz\") pod \"e5f018f4-64b2-4211-aad2-5f3f0865d3c1\" (UID: \"e5f018f4-64b2-4211-aad2-5f3f0865d3c1\") " Mar 20 13:56:04 crc kubenswrapper[4856]: I0320 13:56:04.716441 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz" (OuterVolumeSpecName: "kube-api-access-wbdgz") pod "e5f018f4-64b2-4211-aad2-5f3f0865d3c1" (UID: "e5f018f4-64b2-4211-aad2-5f3f0865d3c1"). InnerVolumeSpecName "kube-api-access-wbdgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:04 crc kubenswrapper[4856]: I0320 13:56:04.809113 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbdgz\" (UniqueName: \"kubernetes.io/projected/e5f018f4-64b2-4211-aad2-5f3f0865d3c1-kube-api-access-wbdgz\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.198977 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-zv78r" event={"ID":"e5f018f4-64b2-4211-aad2-5f3f0865d3c1","Type":"ContainerDied","Data":"c04d7f0a5847cc806f12e475b2c3032bf0375f943646ce72c34f095c6e1c7c7f"} Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.199364 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c04d7f0a5847cc806f12e475b2c3032bf0375f943646ce72c34f095c6e1c7c7f" Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.199050 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-zv78r" Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.643133 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-b9x2v"] Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.649942 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-b9x2v"] Mar 20 13:56:05 crc kubenswrapper[4856]: I0320 13:56:05.851289 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60869252-c55a-4e0a-a06d-9d57b6539209" path="/var/lib/kubelet/pods/60869252-c55a-4e0a-a06d-9d57b6539209/volumes" Mar 20 13:56:06 crc kubenswrapper[4856]: I0320 13:56:06.821830 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:56:06 crc kubenswrapper[4856]: E0320 13:56:06.822797 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:56:19 crc kubenswrapper[4856]: I0320 13:56:19.820155 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:56:19 crc kubenswrapper[4856]: E0320 13:56:19.821206 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:56:33 crc kubenswrapper[4856]: I0320 13:56:33.081622 4856 scope.go:117] "RemoveContainer" containerID="a9476a66d4cd3f2685d44b05c73737f808a2bf651148a8eb37f7976db65af7fb" Mar 20 13:56:33 crc kubenswrapper[4856]: I0320 13:56:33.821080 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:56:33 crc kubenswrapper[4856]: E0320 13:56:33.821853 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:56:46 crc kubenswrapper[4856]: I0320 13:56:46.822206 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:56:46 crc kubenswrapper[4856]: E0320 13:56:46.826316 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:56:59 crc kubenswrapper[4856]: I0320 13:56:59.820911 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:56:59 crc kubenswrapper[4856]: E0320 13:56:59.822833 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:57:11 crc kubenswrapper[4856]: I0320 13:57:11.820115 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:57:11 crc kubenswrapper[4856]: E0320 13:57:11.821211 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:57:26 crc kubenswrapper[4856]: I0320 13:57:26.824020 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:57:26 crc kubenswrapper[4856]: E0320 13:57:26.826493 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:57:38 crc kubenswrapper[4856]: I0320 13:57:38.820408 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:57:38 crc kubenswrapper[4856]: E0320 13:57:38.821293 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 13:57:53 crc kubenswrapper[4856]: I0320 13:57:53.819988 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 13:57:54 crc kubenswrapper[4856]: I0320 13:57:54.109040 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66"} Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.153156 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566918-mvnqw"] Mar 20 13:58:00 crc kubenswrapper[4856]: E0320 13:58:00.154550 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f018f4-64b2-4211-aad2-5f3f0865d3c1" containerName="oc" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.154577 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f018f4-64b2-4211-aad2-5f3f0865d3c1" containerName="oc" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.154847 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f018f4-64b2-4211-aad2-5f3f0865d3c1" containerName="oc" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.155629 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.158592 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.162747 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.163832 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.165672 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-mvnqw"] Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.284569 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gb4x\" (UniqueName: \"kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x\") pod \"auto-csr-approver-29566918-mvnqw\" (UID: \"cc9f4824-1cd5-46a8-93b4-7b733d06878b\") " pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.386174 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gb4x\" (UniqueName: \"kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x\") pod \"auto-csr-approver-29566918-mvnqw\" (UID: \"cc9f4824-1cd5-46a8-93b4-7b733d06878b\") " pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.418438 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gb4x\" (UniqueName: \"kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x\") pod \"auto-csr-approver-29566918-mvnqw\" (UID: \"cc9f4824-1cd5-46a8-93b4-7b733d06878b\") " 
pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.504048 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:00 crc kubenswrapper[4856]: I0320 13:58:00.966178 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-mvnqw"] Mar 20 13:58:01 crc kubenswrapper[4856]: I0320 13:58:01.190152 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" event={"ID":"cc9f4824-1cd5-46a8-93b4-7b733d06878b","Type":"ContainerStarted","Data":"d13513e3b6839c4957e9ea503462d8c32f5b563f7fffa834d221bb45f6e973de"} Mar 20 13:58:03 crc kubenswrapper[4856]: I0320 13:58:03.212144 4856 generic.go:334] "Generic (PLEG): container finished" podID="cc9f4824-1cd5-46a8-93b4-7b733d06878b" containerID="cddf6da8cd436a3fce4b8322960949fb844071ad7e4341bac5f4c52952c60a23" exitCode=0 Mar 20 13:58:03 crc kubenswrapper[4856]: I0320 13:58:03.212250 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" event={"ID":"cc9f4824-1cd5-46a8-93b4-7b733d06878b","Type":"ContainerDied","Data":"cddf6da8cd436a3fce4b8322960949fb844071ad7e4341bac5f4c52952c60a23"} Mar 20 13:58:04 crc kubenswrapper[4856]: I0320 13:58:04.539326 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:04 crc kubenswrapper[4856]: I0320 13:58:04.649682 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gb4x\" (UniqueName: \"kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x\") pod \"cc9f4824-1cd5-46a8-93b4-7b733d06878b\" (UID: \"cc9f4824-1cd5-46a8-93b4-7b733d06878b\") " Mar 20 13:58:04 crc kubenswrapper[4856]: I0320 13:58:04.656799 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x" (OuterVolumeSpecName: "kube-api-access-9gb4x") pod "cc9f4824-1cd5-46a8-93b4-7b733d06878b" (UID: "cc9f4824-1cd5-46a8-93b4-7b733d06878b"). InnerVolumeSpecName "kube-api-access-9gb4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:04 crc kubenswrapper[4856]: I0320 13:58:04.751262 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gb4x\" (UniqueName: \"kubernetes.io/projected/cc9f4824-1cd5-46a8-93b4-7b733d06878b-kube-api-access-9gb4x\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.233242 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" event={"ID":"cc9f4824-1cd5-46a8-93b4-7b733d06878b","Type":"ContainerDied","Data":"d13513e3b6839c4957e9ea503462d8c32f5b563f7fffa834d221bb45f6e973de"} Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.233364 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13513e3b6839c4957e9ea503462d8c32f5b563f7fffa834d221bb45f6e973de" Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.233332 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-mvnqw" Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.615236 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-99ctf"] Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.620437 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-99ctf"] Mar 20 13:58:05 crc kubenswrapper[4856]: I0320 13:58:05.829659 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef24db81-268d-49ad-92b6-79c79398fb22" path="/var/lib/kubelet/pods/ef24db81-268d-49ad-92b6-79c79398fb22/volumes" Mar 20 13:58:33 crc kubenswrapper[4856]: I0320 13:58:33.192975 4856 scope.go:117] "RemoveContainer" containerID="0446fc63578af9c9d70788f4526df6ed94d8ad6ca8b4cb119178c67860206c97" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.420631 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:10 crc kubenswrapper[4856]: E0320 13:59:10.421696 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9f4824-1cd5-46a8-93b4-7b733d06878b" containerName="oc" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.421709 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9f4824-1cd5-46a8-93b4-7b733d06878b" containerName="oc" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.421896 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9f4824-1cd5-46a8-93b4-7b733d06878b" containerName="oc" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.422829 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.449724 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.471327 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.471475 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppzpz\" (UniqueName: \"kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.471514 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.573090 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.573185 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ppzpz\" (UniqueName: \"kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.573210 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.573974 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.573988 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.595651 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppzpz\" (UniqueName: \"kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz\") pod \"redhat-operators-grsg4\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:10 crc kubenswrapper[4856]: I0320 13:59:10.748389 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:11 crc kubenswrapper[4856]: I0320 13:59:11.229515 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:11 crc kubenswrapper[4856]: I0320 13:59:11.852384 4856 generic.go:334] "Generic (PLEG): container finished" podID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerID="8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce" exitCode=0 Mar 20 13:59:11 crc kubenswrapper[4856]: I0320 13:59:11.852451 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerDied","Data":"8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce"} Mar 20 13:59:11 crc kubenswrapper[4856]: I0320 13:59:11.852652 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerStarted","Data":"cd5db6425f1e7740ce27979b994ac3f30057ec90d41f4773d1333ed0784a7d4d"} Mar 20 13:59:11 crc kubenswrapper[4856]: I0320 13:59:11.854586 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:59:13 crc kubenswrapper[4856]: I0320 13:59:13.874221 4856 generic.go:334] "Generic (PLEG): container finished" podID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerID="9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e" exitCode=0 Mar 20 13:59:13 crc kubenswrapper[4856]: I0320 13:59:13.874348 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerDied","Data":"9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e"} Mar 20 13:59:14 crc kubenswrapper[4856]: I0320 13:59:14.885121 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerStarted","Data":"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007"} Mar 20 13:59:14 crc kubenswrapper[4856]: I0320 13:59:14.909346 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-grsg4" podStartSLOduration=2.407233369 podStartE2EDuration="4.909320718s" podCreationTimestamp="2026-03-20 13:59:10 +0000 UTC" firstStartedPulling="2026-03-20 13:59:11.85424752 +0000 UTC m=+2166.735273660" lastFinishedPulling="2026-03-20 13:59:14.356334879 +0000 UTC m=+2169.237361009" observedRunningTime="2026-03-20 13:59:14.900760384 +0000 UTC m=+2169.781786534" watchObservedRunningTime="2026-03-20 13:59:14.909320718 +0000 UTC m=+2169.790346888" Mar 20 13:59:20 crc kubenswrapper[4856]: I0320 13:59:20.749416 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:20 crc kubenswrapper[4856]: I0320 13:59:20.750151 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:21 crc kubenswrapper[4856]: I0320 13:59:21.806788 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-grsg4" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="registry-server" probeResult="failure" output=< Mar 20 13:59:21 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 13:59:21 crc kubenswrapper[4856]: > Mar 20 13:59:30 crc kubenswrapper[4856]: I0320 13:59:30.812227 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:30 crc kubenswrapper[4856]: I0320 13:59:30.884534 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-grsg4" 
Mar 20 13:59:31 crc kubenswrapper[4856]: I0320 13:59:31.062099 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.031734 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-grsg4" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="registry-server" containerID="cri-o://525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007" gracePeriod=2 Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.514849 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.602026 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content\") pod \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.602215 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities\") pod \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.602240 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppzpz\" (UniqueName: \"kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz\") pod \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\" (UID: \"2dac2c82-0209-46ab-860a-7536fd9f6bd0\") " Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.603394 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities" (OuterVolumeSpecName: "utilities") pod "2dac2c82-0209-46ab-860a-7536fd9f6bd0" (UID: "2dac2c82-0209-46ab-860a-7536fd9f6bd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.609567 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz" (OuterVolumeSpecName: "kube-api-access-ppzpz") pod "2dac2c82-0209-46ab-860a-7536fd9f6bd0" (UID: "2dac2c82-0209-46ab-860a-7536fd9f6bd0"). InnerVolumeSpecName "kube-api-access-ppzpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.703309 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.703346 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppzpz\" (UniqueName: \"kubernetes.io/projected/2dac2c82-0209-46ab-860a-7536fd9f6bd0-kube-api-access-ppzpz\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.744921 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2dac2c82-0209-46ab-860a-7536fd9f6bd0" (UID: "2dac2c82-0209-46ab-860a-7536fd9f6bd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:32 crc kubenswrapper[4856]: I0320 13:59:32.805118 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2dac2c82-0209-46ab-860a-7536fd9f6bd0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.042455 4856 generic.go:334] "Generic (PLEG): container finished" podID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerID="525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007" exitCode=0 Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.042510 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-grsg4" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.042534 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerDied","Data":"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007"} Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.042934 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-grsg4" event={"ID":"2dac2c82-0209-46ab-860a-7536fd9f6bd0","Type":"ContainerDied","Data":"cd5db6425f1e7740ce27979b994ac3f30057ec90d41f4773d1333ed0784a7d4d"} Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.042954 4856 scope.go:117] "RemoveContainer" containerID="525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.061831 4856 scope.go:117] "RemoveContainer" containerID="9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.073716 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 
13:59:33.078984 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-grsg4"] Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.093554 4856 scope.go:117] "RemoveContainer" containerID="8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.114401 4856 scope.go:117] "RemoveContainer" containerID="525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007" Mar 20 13:59:33 crc kubenswrapper[4856]: E0320 13:59:33.114975 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007\": container with ID starting with 525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007 not found: ID does not exist" containerID="525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.115025 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007"} err="failed to get container status \"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007\": rpc error: code = NotFound desc = could not find container \"525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007\": container with ID starting with 525823262f2f4c54aff3a9486a3a45167516c71e1f167568d4e5d20b0345e007 not found: ID does not exist" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.115050 4856 scope.go:117] "RemoveContainer" containerID="9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e" Mar 20 13:59:33 crc kubenswrapper[4856]: E0320 13:59:33.115412 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e\": container with ID 
starting with 9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e not found: ID does not exist" containerID="9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.115444 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e"} err="failed to get container status \"9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e\": rpc error: code = NotFound desc = could not find container \"9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e\": container with ID starting with 9aeb6cfc0d61b2ab18cc6ecb086d3910a9f3341a2abf16cc0dde2e5a8fe58f6e not found: ID does not exist" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.115465 4856 scope.go:117] "RemoveContainer" containerID="8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce" Mar 20 13:59:33 crc kubenswrapper[4856]: E0320 13:59:33.115684 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce\": container with ID starting with 8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce not found: ID does not exist" containerID="8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.115745 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce"} err="failed to get container status \"8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce\": rpc error: code = NotFound desc = could not find container \"8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce\": container with ID starting with 8dd6375364ba25127b7f539bfbf02d5e0ae33cfb4a9c04601a201077e3dca8ce not found: 
ID does not exist" Mar 20 13:59:33 crc kubenswrapper[4856]: I0320 13:59:33.829914 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" path="/var/lib/kubelet/pods/2dac2c82-0209-46ab-860a-7536fd9f6bd0/volumes" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.145848 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566920-jg6xl"] Mar 20 14:00:00 crc kubenswrapper[4856]: E0320 14:00:00.147009 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.147032 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4856]: E0320 14:00:00.147064 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.147075 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4856]: E0320 14:00:00.147091 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.147102 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.147406 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dac2c82-0209-46ab-860a-7536fd9f6bd0" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.148109 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.150820 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.150981 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.152640 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.158358 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs"] Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.159363 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.161727 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.161959 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.164044 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-jg6xl"] Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.171665 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs"] Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.299092 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.299133 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.299160 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhnn\" (UniqueName: \"kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.299437 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ptkb\" (UniqueName: \"kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb\") pod \"auto-csr-approver-29566920-jg6xl\" (UID: \"a97624a1-8a30-4c93-8fc8-7995e78480ae\") " pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.401096 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ptkb\" (UniqueName: \"kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb\") pod \"auto-csr-approver-29566920-jg6xl\" (UID: \"a97624a1-8a30-4c93-8fc8-7995e78480ae\") " pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 
14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.401197 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.401239 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.401264 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfhnn\" (UniqueName: \"kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.402220 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.412736 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume\") pod \"collect-profiles-29566920-vv5hs\" (UID: 
\"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.421573 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfhnn\" (UniqueName: \"kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn\") pod \"collect-profiles-29566920-vv5hs\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.422588 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ptkb\" (UniqueName: \"kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb\") pod \"auto-csr-approver-29566920-jg6xl\" (UID: \"a97624a1-8a30-4c93-8fc8-7995e78480ae\") " pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.473705 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.480180 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.897446 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs"] Mar 20 14:00:00 crc kubenswrapper[4856]: W0320 14:00:00.970750 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97624a1_8a30_4c93_8fc8_7995e78480ae.slice/crio-bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b WatchSource:0}: Error finding container bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b: Status 404 returned error can't find the container with id bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b Mar 20 14:00:00 crc kubenswrapper[4856]: I0320 14:00:00.970819 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-jg6xl"] Mar 20 14:00:01 crc kubenswrapper[4856]: I0320 14:00:01.268871 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e31f256-88cb-4b92-948d-21602388727a" containerID="b16de28596509ca45de50ad52974e3e732329e5400c9ea35da97ba81b59777b9" exitCode=0 Mar 20 14:00:01 crc kubenswrapper[4856]: I0320 14:00:01.268955 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" event={"ID":"3e31f256-88cb-4b92-948d-21602388727a","Type":"ContainerDied","Data":"b16de28596509ca45de50ad52974e3e732329e5400c9ea35da97ba81b59777b9"} Mar 20 14:00:01 crc kubenswrapper[4856]: I0320 14:00:01.268984 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" event={"ID":"3e31f256-88cb-4b92-948d-21602388727a","Type":"ContainerStarted","Data":"1da99ad3831dddd153ad13e4651b5cf3bec5e2b26cae471023f77f3f93bdf8ed"} Mar 20 14:00:01 crc kubenswrapper[4856]: I0320 
14:00:01.270030 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" event={"ID":"a97624a1-8a30-4c93-8fc8-7995e78480ae","Type":"ContainerStarted","Data":"bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b"} Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.561384 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.732207 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfhnn\" (UniqueName: \"kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn\") pod \"3e31f256-88cb-4b92-948d-21602388727a\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.732362 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume\") pod \"3e31f256-88cb-4b92-948d-21602388727a\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.732501 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume\") pod \"3e31f256-88cb-4b92-948d-21602388727a\" (UID: \"3e31f256-88cb-4b92-948d-21602388727a\") " Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.733201 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e31f256-88cb-4b92-948d-21602388727a" (UID: "3e31f256-88cb-4b92-948d-21602388727a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.741106 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e31f256-88cb-4b92-948d-21602388727a" (UID: "3e31f256-88cb-4b92-948d-21602388727a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.741773 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn" (OuterVolumeSpecName: "kube-api-access-bfhnn") pod "3e31f256-88cb-4b92-948d-21602388727a" (UID: "3e31f256-88cb-4b92-948d-21602388727a"). InnerVolumeSpecName "kube-api-access-bfhnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.834452 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfhnn\" (UniqueName: \"kubernetes.io/projected/3e31f256-88cb-4b92-948d-21602388727a-kube-api-access-bfhnn\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.834497 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e31f256-88cb-4b92-948d-21602388727a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:02 crc kubenswrapper[4856]: I0320 14:00:02.834511 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e31f256-88cb-4b92-948d-21602388727a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.288125 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" 
event={"ID":"3e31f256-88cb-4b92-948d-21602388727a","Type":"ContainerDied","Data":"1da99ad3831dddd153ad13e4651b5cf3bec5e2b26cae471023f77f3f93bdf8ed"} Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.288182 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1da99ad3831dddd153ad13e4651b5cf3bec5e2b26cae471023f77f3f93bdf8ed" Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.288182 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs" Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.624913 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"] Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.634348 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-w5vwh"] Mar 20 14:00:03 crc kubenswrapper[4856]: I0320 14:00:03.830197 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0acb35-a304-43e3-8306-5c5319d0e8e8" path="/var/lib/kubelet/pods/1f0acb35-a304-43e3-8306-5c5319d0e8e8/volumes" Mar 20 14:00:05 crc kubenswrapper[4856]: I0320 14:00:05.304292 4856 generic.go:334] "Generic (PLEG): container finished" podID="a97624a1-8a30-4c93-8fc8-7995e78480ae" containerID="0f18f686dd1805405b7f3d9136d35be1af26594a8d887aeebd7dba8fc9bc1c78" exitCode=0 Mar 20 14:00:05 crc kubenswrapper[4856]: I0320 14:00:05.304336 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" event={"ID":"a97624a1-8a30-4c93-8fc8-7995e78480ae","Type":"ContainerDied","Data":"0f18f686dd1805405b7f3d9136d35be1af26594a8d887aeebd7dba8fc9bc1c78"} Mar 20 14:00:06 crc kubenswrapper[4856]: I0320 14:00:06.648431 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:06 crc kubenswrapper[4856]: I0320 14:00:06.789401 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ptkb\" (UniqueName: \"kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb\") pod \"a97624a1-8a30-4c93-8fc8-7995e78480ae\" (UID: \"a97624a1-8a30-4c93-8fc8-7995e78480ae\") " Mar 20 14:00:06 crc kubenswrapper[4856]: I0320 14:00:06.795401 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb" (OuterVolumeSpecName: "kube-api-access-7ptkb") pod "a97624a1-8a30-4c93-8fc8-7995e78480ae" (UID: "a97624a1-8a30-4c93-8fc8-7995e78480ae"). InnerVolumeSpecName "kube-api-access-7ptkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:06 crc kubenswrapper[4856]: I0320 14:00:06.890768 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ptkb\" (UniqueName: \"kubernetes.io/projected/a97624a1-8a30-4c93-8fc8-7995e78480ae-kube-api-access-7ptkb\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.323243 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" event={"ID":"a97624a1-8a30-4c93-8fc8-7995e78480ae","Type":"ContainerDied","Data":"bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b"} Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.323308 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2bef3b9fd15843539d85f0833730d2069ffd20c9c259a62d7fa6a1cf2c681b" Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.323378 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-jg6xl" Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.701584 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-6z2wg"] Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.707244 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-6z2wg"] Mar 20 14:00:07 crc kubenswrapper[4856]: I0320 14:00:07.828761 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c2fe13-4636-47e7-a767-5756313be1e2" path="/var/lib/kubelet/pods/b5c2fe13-4636-47e7-a767-5756313be1e2/volumes" Mar 20 14:00:09 crc kubenswrapper[4856]: I0320 14:00:09.987622 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:00:09 crc kubenswrapper[4856]: I0320 14:00:09.987824 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:00:33 crc kubenswrapper[4856]: I0320 14:00:33.307134 4856 scope.go:117] "RemoveContainer" containerID="d8e121aeb28378339b265763f110f9179a4a213930ac1e352d0fbe0ec10ab645" Mar 20 14:00:33 crc kubenswrapper[4856]: I0320 14:00:33.360813 4856 scope.go:117] "RemoveContainer" containerID="2f1a52b951c5480ea3fd08fab44d8c11ef8195ee8461d4246c8df67127bf2522" Mar 20 14:00:39 crc kubenswrapper[4856]: I0320 14:00:39.987958 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:00:39 crc kubenswrapper[4856]: I0320 14:00:39.988430 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:01:09 crc kubenswrapper[4856]: I0320 14:01:09.987148 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:01:09 crc kubenswrapper[4856]: I0320 14:01:09.987742 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:01:09 crc kubenswrapper[4856]: I0320 14:01:09.987791 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:01:09 crc kubenswrapper[4856]: I0320 14:01:09.988531 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:01:09 crc kubenswrapper[4856]: I0320 14:01:09.988603 4856 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66" gracePeriod=600 Mar 20 14:01:10 crc kubenswrapper[4856]: I0320 14:01:10.849554 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66" exitCode=0 Mar 20 14:01:10 crc kubenswrapper[4856]: I0320 14:01:10.849666 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66"} Mar 20 14:01:10 crc kubenswrapper[4856]: I0320 14:01:10.850893 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"} Mar 20 14:01:10 crc kubenswrapper[4856]: I0320 14:01:10.850938 4856 scope.go:117] "RemoveContainer" containerID="0dc046131359332ed1376bdc46f5b0b52562fa5c630e2713fb95ca1d1663bd3e" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.223654 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:24 crc kubenswrapper[4856]: E0320 14:01:24.225904 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e31f256-88cb-4b92-948d-21602388727a" containerName="collect-profiles" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.225931 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e31f256-88cb-4b92-948d-21602388727a" 
containerName="collect-profiles" Mar 20 14:01:24 crc kubenswrapper[4856]: E0320 14:01:24.225958 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97624a1-8a30-4c93-8fc8-7995e78480ae" containerName="oc" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.225970 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97624a1-8a30-4c93-8fc8-7995e78480ae" containerName="oc" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.226241 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e31f256-88cb-4b92-948d-21602388727a" containerName="collect-profiles" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.226260 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97624a1-8a30-4c93-8fc8-7995e78480ae" containerName="oc" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.231814 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.235924 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.275308 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.275383 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gsx\" (UniqueName: \"kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " 
pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.275463 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.377006 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.377148 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.377182 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gsx\" (UniqueName: \"kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.377656 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " 
pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.377734 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.404706 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gsx\" (UniqueName: \"kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx\") pod \"redhat-marketplace-7s9rd\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:24 crc kubenswrapper[4856]: I0320 14:01:24.569188 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:25 crc kubenswrapper[4856]: I0320 14:01:25.094133 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:25 crc kubenswrapper[4856]: I0320 14:01:25.998132 4856 generic.go:334] "Generic (PLEG): container finished" podID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerID="453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882" exitCode=0 Mar 20 14:01:25 crc kubenswrapper[4856]: I0320 14:01:25.998304 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerDied","Data":"453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882"} Mar 20 14:01:25 crc kubenswrapper[4856]: I0320 14:01:25.998361 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" 
event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerStarted","Data":"d1fca10653af2086eacf900450d76d079b2759e97891da5a3d0de3cfa99523d7"} Mar 20 14:01:27 crc kubenswrapper[4856]: I0320 14:01:27.006165 4856 generic.go:334] "Generic (PLEG): container finished" podID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerID="c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b" exitCode=0 Mar 20 14:01:27 crc kubenswrapper[4856]: I0320 14:01:27.006285 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerDied","Data":"c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b"} Mar 20 14:01:29 crc kubenswrapper[4856]: I0320 14:01:29.026793 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerStarted","Data":"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737"} Mar 20 14:01:34 crc kubenswrapper[4856]: I0320 14:01:34.569608 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:34 crc kubenswrapper[4856]: I0320 14:01:34.570078 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:34 crc kubenswrapper[4856]: I0320 14:01:34.619675 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:34 crc kubenswrapper[4856]: I0320 14:01:34.649153 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7s9rd" podStartSLOduration=8.331996143 podStartE2EDuration="10.649134931s" podCreationTimestamp="2026-03-20 14:01:24 +0000 UTC" firstStartedPulling="2026-03-20 14:01:26.000880938 +0000 UTC 
m=+2300.881907078" lastFinishedPulling="2026-03-20 14:01:28.318019696 +0000 UTC m=+2303.199045866" observedRunningTime="2026-03-20 14:01:29.044383725 +0000 UTC m=+2303.925409885" watchObservedRunningTime="2026-03-20 14:01:34.649134931 +0000 UTC m=+2309.530161061" Mar 20 14:01:35 crc kubenswrapper[4856]: I0320 14:01:35.143424 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:35 crc kubenswrapper[4856]: I0320 14:01:35.201690 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:37 crc kubenswrapper[4856]: I0320 14:01:37.110244 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7s9rd" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="registry-server" containerID="cri-o://ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737" gracePeriod=2 Mar 20 14:01:37 crc kubenswrapper[4856]: I0320 14:01:37.987082 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.077254 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities\") pod \"ae891c4b-5670-4a45-9484-f1e5e70a7553\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.077336 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content\") pod \"ae891c4b-5670-4a45-9484-f1e5e70a7553\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.077393 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9gsx\" (UniqueName: \"kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx\") pod \"ae891c4b-5670-4a45-9484-f1e5e70a7553\" (UID: \"ae891c4b-5670-4a45-9484-f1e5e70a7553\") " Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.078979 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities" (OuterVolumeSpecName: "utilities") pod "ae891c4b-5670-4a45-9484-f1e5e70a7553" (UID: "ae891c4b-5670-4a45-9484-f1e5e70a7553"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.083704 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx" (OuterVolumeSpecName: "kube-api-access-c9gsx") pod "ae891c4b-5670-4a45-9484-f1e5e70a7553" (UID: "ae891c4b-5670-4a45-9484-f1e5e70a7553"). InnerVolumeSpecName "kube-api-access-c9gsx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.108457 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae891c4b-5670-4a45-9484-f1e5e70a7553" (UID: "ae891c4b-5670-4a45-9484-f1e5e70a7553"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.118849 4856 generic.go:334] "Generic (PLEG): container finished" podID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerID="ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737" exitCode=0 Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.118888 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerDied","Data":"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737"} Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.118896 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7s9rd" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.118913 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7s9rd" event={"ID":"ae891c4b-5670-4a45-9484-f1e5e70a7553","Type":"ContainerDied","Data":"d1fca10653af2086eacf900450d76d079b2759e97891da5a3d0de3cfa99523d7"} Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.118929 4856 scope.go:117] "RemoveContainer" containerID="ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.152040 4856 scope.go:117] "RemoveContainer" containerID="c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.153118 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.162685 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7s9rd"] Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.166825 4856 scope.go:117] "RemoveContainer" containerID="453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.179394 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9gsx\" (UniqueName: \"kubernetes.io/projected/ae891c4b-5670-4a45-9484-f1e5e70a7553-kube-api-access-c9gsx\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.179441 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.179456 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae891c4b-5670-4a45-9484-f1e5e70a7553-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.196674 4856 scope.go:117] "RemoveContainer" containerID="ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737"
Mar 20 14:01:38 crc kubenswrapper[4856]: E0320 14:01:38.197398 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737\": container with ID starting with ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737 not found: ID does not exist" containerID="ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737"
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.197498 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737"} err="failed to get container status \"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737\": rpc error: code = NotFound desc = could not find container \"ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737\": container with ID starting with ad9e4f10c6083b30e0c0cbe06a0e751d869f2864181ac7dfa322611a9d481737 not found: ID does not exist"
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.197577 4856 scope.go:117] "RemoveContainer" containerID="c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b"
Mar 20 14:01:38 crc kubenswrapper[4856]: E0320 14:01:38.198239 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b\": container with ID starting with c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b not found: ID does not exist" containerID="c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b"
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.198347 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b"} err="failed to get container status \"c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b\": rpc error: code = NotFound desc = could not find container \"c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b\": container with ID starting with c96cde8ea22abb1ef62a3daf018ed08f4dfafd1c5f77f1c259692a80d5f9ad0b not found: ID does not exist"
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.198394 4856 scope.go:117] "RemoveContainer" containerID="453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882"
Mar 20 14:01:38 crc kubenswrapper[4856]: E0320 14:01:38.198939 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882\": container with ID starting with 453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882 not found: ID does not exist" containerID="453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882"
Mar 20 14:01:38 crc kubenswrapper[4856]: I0320 14:01:38.199060 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882"} err="failed to get container status \"453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882\": rpc error: code = NotFound desc = could not find container \"453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882\": container with ID starting with 453ffba782537210ef4a33aeb717b646a8ddbe2d5723c5f27da51fbf63dad882 not found: ID does not exist"
Mar 20 14:01:39 crc kubenswrapper[4856]: I0320 14:01:39.830720 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" path="/var/lib/kubelet/pods/ae891c4b-5670-4a45-9484-f1e5e70a7553/volumes"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.153531 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjgmm"]
Mar 20 14:02:00 crc kubenswrapper[4856]: E0320 14:02:00.155234 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="registry-server"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.155250 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="registry-server"
Mar 20 14:02:00 crc kubenswrapper[4856]: E0320 14:02:00.155288 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="extract-content"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.155296 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="extract-content"
Mar 20 14:02:00 crc kubenswrapper[4856]: E0320 14:02:00.155309 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="extract-utilities"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.155315 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="extract-utilities"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.155451 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae891c4b-5670-4a45-9484-f1e5e70a7553" containerName="registry-server"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.155881 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.158105 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.158428 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.158984 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.171674 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjgmm"]
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.200776 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgcz8\" (UniqueName: \"kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8\") pod \"auto-csr-approver-29566922-qjgmm\" (UID: \"8688ef36-c016-4451-aed2-9da03340c237\") " pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.301429 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgcz8\" (UniqueName: \"kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8\") pod \"auto-csr-approver-29566922-qjgmm\" (UID: \"8688ef36-c016-4451-aed2-9da03340c237\") " pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.329920 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgcz8\" (UniqueName: \"kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8\") pod \"auto-csr-approver-29566922-qjgmm\" (UID: \"8688ef36-c016-4451-aed2-9da03340c237\") " pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.477535 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:00 crc kubenswrapper[4856]: I0320 14:02:00.903692 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjgmm"]
Mar 20 14:02:01 crc kubenswrapper[4856]: I0320 14:02:01.300925 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjgmm" event={"ID":"8688ef36-c016-4451-aed2-9da03340c237","Type":"ContainerStarted","Data":"9af5ec70304f8437e8a1427a52dce22e7cba5c4ec5a13fd567c41f58b119b56a"}
Mar 20 14:02:02 crc kubenswrapper[4856]: I0320 14:02:02.312504 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjgmm" event={"ID":"8688ef36-c016-4451-aed2-9da03340c237","Type":"ContainerStarted","Data":"f16bdd978667e6efde3773e3e6d222aaa3f2ffb28961a49ec5830793c4180b65"}
Mar 20 14:02:02 crc kubenswrapper[4856]: I0320 14:02:02.340178 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566922-qjgmm" podStartSLOduration=1.252661637 podStartE2EDuration="2.340159465s" podCreationTimestamp="2026-03-20 14:02:00 +0000 UTC" firstStartedPulling="2026-03-20 14:02:00.909348836 +0000 UTC m=+2335.790374966" lastFinishedPulling="2026-03-20 14:02:01.996846654 +0000 UTC m=+2336.877872794" observedRunningTime="2026-03-20 14:02:02.332957528 +0000 UTC m=+2337.213983668" watchObservedRunningTime="2026-03-20 14:02:02.340159465 +0000 UTC m=+2337.221185605"
Mar 20 14:02:03 crc kubenswrapper[4856]: I0320 14:02:03.323789 4856 generic.go:334] "Generic (PLEG): container finished" podID="8688ef36-c016-4451-aed2-9da03340c237" containerID="f16bdd978667e6efde3773e3e6d222aaa3f2ffb28961a49ec5830793c4180b65" exitCode=0
Mar 20 14:02:03 crc kubenswrapper[4856]: I0320 14:02:03.323857 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjgmm" event={"ID":"8688ef36-c016-4451-aed2-9da03340c237","Type":"ContainerDied","Data":"f16bdd978667e6efde3773e3e6d222aaa3f2ffb28961a49ec5830793c4180b65"}
Mar 20 14:02:04 crc kubenswrapper[4856]: I0320 14:02:04.633868 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:04 crc kubenswrapper[4856]: I0320 14:02:04.768217 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgcz8\" (UniqueName: \"kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8\") pod \"8688ef36-c016-4451-aed2-9da03340c237\" (UID: \"8688ef36-c016-4451-aed2-9da03340c237\") "
Mar 20 14:02:04 crc kubenswrapper[4856]: I0320 14:02:04.774011 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8" (OuterVolumeSpecName: "kube-api-access-bgcz8") pod "8688ef36-c016-4451-aed2-9da03340c237" (UID: "8688ef36-c016-4451-aed2-9da03340c237"). InnerVolumeSpecName "kube-api-access-bgcz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:02:04 crc kubenswrapper[4856]: I0320 14:02:04.869969 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgcz8\" (UniqueName: \"kubernetes.io/projected/8688ef36-c016-4451-aed2-9da03340c237-kube-api-access-bgcz8\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.347464 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjgmm" event={"ID":"8688ef36-c016-4451-aed2-9da03340c237","Type":"ContainerDied","Data":"9af5ec70304f8437e8a1427a52dce22e7cba5c4ec5a13fd567c41f58b119b56a"}
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.347506 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9af5ec70304f8437e8a1427a52dce22e7cba5c4ec5a13fd567c41f58b119b56a"
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.347560 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjgmm"
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.408485 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-zv78r"]
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.412991 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-zv78r"]
Mar 20 14:02:05 crc kubenswrapper[4856]: I0320 14:02:05.838521 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f018f4-64b2-4211-aad2-5f3f0865d3c1" path="/var/lib/kubelet/pods/e5f018f4-64b2-4211-aad2-5f3f0865d3c1/volumes"
Mar 20 14:02:33 crc kubenswrapper[4856]: I0320 14:02:33.457676 4856 scope.go:117] "RemoveContainer" containerID="603821a4df46497e0fb06e6421765c404890f6e0e5e063fea08d49df33ce7688"
Mar 20 14:03:39 crc kubenswrapper[4856]: I0320 14:03:39.987678 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:03:39 crc kubenswrapper[4856]: I0320 14:03:39.988345 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.167244 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t44zb"]
Mar 20 14:04:00 crc kubenswrapper[4856]: E0320 14:04:00.168749 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8688ef36-c016-4451-aed2-9da03340c237" containerName="oc"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.168777 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="8688ef36-c016-4451-aed2-9da03340c237" containerName="oc"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.168944 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="8688ef36-c016-4451-aed2-9da03340c237" containerName="oc"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.169427 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.172466 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.173238 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.175943 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.176088 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t44zb"]
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.305937 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kr2n\" (UniqueName: \"kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n\") pod \"auto-csr-approver-29566924-t44zb\" (UID: \"da8dc781-e6f0-4351-9515-eaf71c3d85ef\") " pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.407110 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kr2n\" (UniqueName: \"kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n\") pod \"auto-csr-approver-29566924-t44zb\" (UID: \"da8dc781-e6f0-4351-9515-eaf71c3d85ef\") " pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.426220 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kr2n\" (UniqueName: \"kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n\") pod \"auto-csr-approver-29566924-t44zb\" (UID: \"da8dc781-e6f0-4351-9515-eaf71c3d85ef\") " pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.512091 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:00 crc kubenswrapper[4856]: I0320 14:04:00.944216 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t44zb"]
Mar 20 14:04:01 crc kubenswrapper[4856]: I0320 14:04:01.374399 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-t44zb" event={"ID":"da8dc781-e6f0-4351-9515-eaf71c3d85ef","Type":"ContainerStarted","Data":"33ecd8e4ec926c3fd43972a041085b958155f17400ecb256dd06001069fb6daa"}
Mar 20 14:04:02 crc kubenswrapper[4856]: I0320 14:04:02.381665 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-t44zb" event={"ID":"da8dc781-e6f0-4351-9515-eaf71c3d85ef","Type":"ContainerStarted","Data":"056de529056419c4ece5d91e19853803cc0e5bd8baeb1a012d6d9d58f8fd45ff"}
Mar 20 14:04:02 crc kubenswrapper[4856]: I0320 14:04:02.398409 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566924-t44zb" podStartSLOduration=1.3354139090000001 podStartE2EDuration="2.398386416s" podCreationTimestamp="2026-03-20 14:04:00 +0000 UTC" firstStartedPulling="2026-03-20 14:04:00.956797443 +0000 UTC m=+2455.837823583" lastFinishedPulling="2026-03-20 14:04:02.01976995 +0000 UTC m=+2456.900796090" observedRunningTime="2026-03-20 14:04:02.396578237 +0000 UTC m=+2457.277604377" watchObservedRunningTime="2026-03-20 14:04:02.398386416 +0000 UTC m=+2457.279412546"
Mar 20 14:04:03 crc kubenswrapper[4856]: I0320 14:04:03.393294 4856 generic.go:334] "Generic (PLEG): container finished" podID="da8dc781-e6f0-4351-9515-eaf71c3d85ef" containerID="056de529056419c4ece5d91e19853803cc0e5bd8baeb1a012d6d9d58f8fd45ff" exitCode=0
Mar 20 14:04:03 crc kubenswrapper[4856]: I0320 14:04:03.393401 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-t44zb" event={"ID":"da8dc781-e6f0-4351-9515-eaf71c3d85ef","Type":"ContainerDied","Data":"056de529056419c4ece5d91e19853803cc0e5bd8baeb1a012d6d9d58f8fd45ff"}
Mar 20 14:04:04 crc kubenswrapper[4856]: I0320 14:04:04.677974 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:04 crc kubenswrapper[4856]: I0320 14:04:04.774486 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kr2n\" (UniqueName: \"kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n\") pod \"da8dc781-e6f0-4351-9515-eaf71c3d85ef\" (UID: \"da8dc781-e6f0-4351-9515-eaf71c3d85ef\") "
Mar 20 14:04:04 crc kubenswrapper[4856]: I0320 14:04:04.781179 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n" (OuterVolumeSpecName: "kube-api-access-7kr2n") pod "da8dc781-e6f0-4351-9515-eaf71c3d85ef" (UID: "da8dc781-e6f0-4351-9515-eaf71c3d85ef"). InnerVolumeSpecName "kube-api-access-7kr2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:04:04 crc kubenswrapper[4856]: I0320 14:04:04.877449 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kr2n\" (UniqueName: \"kubernetes.io/projected/da8dc781-e6f0-4351-9515-eaf71c3d85ef-kube-api-access-7kr2n\") on node \"crc\" DevicePath \"\""
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.411898 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-t44zb" event={"ID":"da8dc781-e6f0-4351-9515-eaf71c3d85ef","Type":"ContainerDied","Data":"33ecd8e4ec926c3fd43972a041085b958155f17400ecb256dd06001069fb6daa"}
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.411935 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ecd8e4ec926c3fd43972a041085b958155f17400ecb256dd06001069fb6daa"
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.411957 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t44zb"
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.473055 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-mvnqw"]
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.479369 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-mvnqw"]
Mar 20 14:04:05 crc kubenswrapper[4856]: I0320 14:04:05.834100 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9f4824-1cd5-46a8-93b4-7b733d06878b" path="/var/lib/kubelet/pods/cc9f4824-1cd5-46a8-93b4-7b733d06878b/volumes"
Mar 20 14:04:09 crc kubenswrapper[4856]: I0320 14:04:09.987642 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:04:09 crc kubenswrapper[4856]: I0320 14:04:09.988223 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:04:33 crc kubenswrapper[4856]: I0320 14:04:33.592352 4856 scope.go:117] "RemoveContainer" containerID="cddf6da8cd436a3fce4b8322960949fb844071ad7e4341bac5f4c52952c60a23"
Mar 20 14:04:39 crc kubenswrapper[4856]: I0320 14:04:39.987652 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:04:39 crc kubenswrapper[4856]: I0320 14:04:39.988580 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:04:39 crc kubenswrapper[4856]: I0320 14:04:39.988629 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4"
Mar 20 14:04:39 crc kubenswrapper[4856]: I0320 14:04:39.989590 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:04:39 crc kubenswrapper[4856]: I0320 14:04:39.989641 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" gracePeriod=600
Mar 20 14:04:40 crc kubenswrapper[4856]: E0320 14:04:40.127488 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:04:40 crc kubenswrapper[4856]: I0320 14:04:40.683946 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" exitCode=0
Mar 20 14:04:40 crc kubenswrapper[4856]: I0320 14:04:40.684033 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"}
Mar 20 14:04:40 crc kubenswrapper[4856]: I0320 14:04:40.684301 4856 scope.go:117] "RemoveContainer" containerID="ea12624ec23b9a16fd52c435e61ed69c10c7d2e47860ec1cdaa4a9f765b8ce66"
Mar 20 14:04:40 crc kubenswrapper[4856]: I0320 14:04:40.685085 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"
Mar 20 14:04:40 crc kubenswrapper[4856]: E0320 14:04:40.685793 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:04:51 crc kubenswrapper[4856]: I0320 14:04:51.820579 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"
Mar 20 14:04:51 crc kubenswrapper[4856]: E0320 14:04:51.821478 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.709667 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b445h"]
Mar 20 14:05:00 crc kubenswrapper[4856]: E0320 14:05:00.711896 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da8dc781-e6f0-4351-9515-eaf71c3d85ef" containerName="oc"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.712016 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="da8dc781-e6f0-4351-9515-eaf71c3d85ef" containerName="oc"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.712308 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="da8dc781-e6f0-4351-9515-eaf71c3d85ef" containerName="oc"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.713686 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.722520 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b445h"]
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.818026 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.818391 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkfq\" (UniqueName: \"kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.818524 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.919991 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkfq\" (UniqueName: \"kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.920079 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.920108 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.920710 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.920982 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:00 crc kubenswrapper[4856]: I0320 14:05:00.941237 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkfq\" (UniqueName: \"kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq\") pod \"community-operators-b445h\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.089362 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.593250 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b445h"]
Mar 20 14:05:01 crc kubenswrapper[4856]: W0320 14:05:01.594945 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcf5a98f_c0da_481b_8a2c_d8dfb0f6a330.slice/crio-68eba390f305274be00ee5d7a8b83feae88632deae50e41be8e5ffeae2b4628b WatchSource:0}: Error finding container 68eba390f305274be00ee5d7a8b83feae88632deae50e41be8e5ffeae2b4628b: Status 404 returned error can't find the container with id 68eba390f305274be00ee5d7a8b83feae88632deae50e41be8e5ffeae2b4628b
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.878223 4856 generic.go:334] "Generic (PLEG): container finished" podID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerID="83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f" exitCode=0
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.878321 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerDied","Data":"83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f"}
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.878383 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerStarted","Data":"68eba390f305274be00ee5d7a8b83feae88632deae50e41be8e5ffeae2b4628b"}
Mar 20 14:05:01 crc kubenswrapper[4856]: I0320 14:05:01.881668 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:05:03 crc kubenswrapper[4856]: I0320 14:05:03.892599 4856 generic.go:334] "Generic (PLEG): container finished" podID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerID="a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089" exitCode=0
Mar 20 14:05:03 crc kubenswrapper[4856]: I0320 14:05:03.892683 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerDied","Data":"a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089"}
Mar 20 14:05:04 crc kubenswrapper[4856]: I0320 14:05:04.902403 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerStarted","Data":"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0"}
Mar 20 14:05:04 crc kubenswrapper[4856]: I0320 14:05:04.931790 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b445h" podStartSLOduration=2.478771209 podStartE2EDuration="4.931766172s" podCreationTimestamp="2026-03-20 14:05:00 +0000 UTC" firstStartedPulling="2026-03-20 14:05:01.881047167 +0000 UTC m=+2516.762073307" lastFinishedPulling="2026-03-20 14:05:04.33404212 +0000 UTC m=+2519.215068270" observedRunningTime="2026-03-20 14:05:04.924794269 +0000 UTC m=+2519.805820409" watchObservedRunningTime="2026-03-20 14:05:04.931766172 +0000 UTC m=+2519.812792302"
Mar 20 14:05:05 crc kubenswrapper[4856]: I0320 14:05:05.824095 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06"
Mar 20 14:05:05 crc kubenswrapper[4856]: E0320 14:05:05.825573 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:05:11 crc kubenswrapper[4856]: I0320 14:05:11.089991 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:11 crc kubenswrapper[4856]: I0320 14:05:11.091130 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:11 crc kubenswrapper[4856]: I0320 14:05:11.166533 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:12 crc kubenswrapper[4856]: I0320 14:05:12.005399 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:12 crc kubenswrapper[4856]: I0320 14:05:12.062568 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b445h"]
Mar 20 14:05:13 crc kubenswrapper[4856]: I0320 14:05:13.974742 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b445h" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="registry-server" containerID="cri-o://40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0" gracePeriod=2
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.973546 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b445h"
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.980348 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkfq\" (UniqueName: \"kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq\") pod \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") "
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.980483 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content\") pod \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") "
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.984145 4856 generic.go:334] "Generic (PLEG): container finished" podID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerID="40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0" exitCode=0
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.984195 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerDied","Data":"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0"}
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.984226 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b445h" event={"ID":"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330","Type":"ContainerDied","Data":"68eba390f305274be00ee5d7a8b83feae88632deae50e41be8e5ffeae2b4628b"}
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.984255 4856 scope.go:117] "RemoveContainer" containerID="40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0"
Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.984360 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-b445h" Mar 20 14:05:14 crc kubenswrapper[4856]: I0320 14:05:14.988674 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq" (OuterVolumeSpecName: "kube-api-access-6dkfq") pod "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" (UID: "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330"). InnerVolumeSpecName "kube-api-access-6dkfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.026740 4856 scope.go:117] "RemoveContainer" containerID="a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.046460 4856 scope.go:117] "RemoveContainer" containerID="83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.063250 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" (UID: "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.073477 4856 scope.go:117] "RemoveContainer" containerID="40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0" Mar 20 14:05:15 crc kubenswrapper[4856]: E0320 14:05:15.074111 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0\": container with ID starting with 40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0 not found: ID does not exist" containerID="40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.074148 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0"} err="failed to get container status \"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0\": rpc error: code = NotFound desc = could not find container \"40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0\": container with ID starting with 40f2bc6052f5d17cc9e77951c16a501cdc7b3f7cd6a999b383401d60ea368ba0 not found: ID does not exist" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.074173 4856 scope.go:117] "RemoveContainer" containerID="a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089" Mar 20 14:05:15 crc kubenswrapper[4856]: E0320 14:05:15.074466 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089\": container with ID starting with a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089 not found: ID does not exist" containerID="a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.074491 
4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089"} err="failed to get container status \"a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089\": rpc error: code = NotFound desc = could not find container \"a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089\": container with ID starting with a25e88e013f88e77266e4cce9ed127828fd1b0af0e71e4592451eda4a6f74089 not found: ID does not exist" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.074509 4856 scope.go:117] "RemoveContainer" containerID="83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f" Mar 20 14:05:15 crc kubenswrapper[4856]: E0320 14:05:15.074851 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f\": container with ID starting with 83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f not found: ID does not exist" containerID="83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.074876 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f"} err="failed to get container status \"83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f\": rpc error: code = NotFound desc = could not find container \"83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f\": container with ID starting with 83c15dc424390b2f69304ca57655dd3f8cfce1f363df1f52f2636ed58533694f not found: ID does not exist" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.081640 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities\") pod \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\" (UID: \"fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330\") " Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.081945 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkfq\" (UniqueName: \"kubernetes.io/projected/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-kube-api-access-6dkfq\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.081968 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.082481 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities" (OuterVolumeSpecName: "utilities") pod "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" (UID: "fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.183919 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.334932 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b445h"] Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.345789 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b445h"] Mar 20 14:05:15 crc kubenswrapper[4856]: I0320 14:05:15.831555 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" path="/var/lib/kubelet/pods/fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330/volumes" Mar 20 14:05:20 crc kubenswrapper[4856]: I0320 14:05:20.819671 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:05:20 crc kubenswrapper[4856]: E0320 14:05:20.820170 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:05:33 crc kubenswrapper[4856]: I0320 14:05:33.820346 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:05:33 crc kubenswrapper[4856]: E0320 14:05:33.821081 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:05:48 crc kubenswrapper[4856]: I0320 14:05:48.820402 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:05:48 crc kubenswrapper[4856]: E0320 14:05:48.821827 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.709633 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5xl9m"] Mar 20 14:05:57 crc kubenswrapper[4856]: E0320 14:05:57.710582 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="registry-server" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.710603 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="registry-server" Mar 20 14:05:57 crc kubenswrapper[4856]: E0320 14:05:57.710618 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="extract-content" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.710625 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="extract-content" Mar 20 14:05:57 crc kubenswrapper[4856]: E0320 14:05:57.710648 4856 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="extract-utilities" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.710654 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="extract-utilities" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.710794 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcf5a98f-c0da-481b-8a2c-d8dfb0f6a330" containerName="registry-server" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.711954 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.733844 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xl9m"] Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.844188 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vg5\" (UniqueName: \"kubernetes.io/projected/3e12430a-718f-4734-bfa0-1e4fd5d46b38-kube-api-access-49vg5\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.844484 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-catalog-content\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.844536 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-utilities\") pod 
\"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.946492 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-catalog-content\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.946538 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-utilities\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.946569 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vg5\" (UniqueName: \"kubernetes.io/projected/3e12430a-718f-4734-bfa0-1e4fd5d46b38-kube-api-access-49vg5\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.947083 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-catalog-content\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.947129 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3e12430a-718f-4734-bfa0-1e4fd5d46b38-utilities\") pod \"certified-operators-5xl9m\" (UID: 
\"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:57 crc kubenswrapper[4856]: I0320 14:05:57.968986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vg5\" (UniqueName: \"kubernetes.io/projected/3e12430a-718f-4734-bfa0-1e4fd5d46b38-kube-api-access-49vg5\") pod \"certified-operators-5xl9m\" (UID: \"3e12430a-718f-4734-bfa0-1e4fd5d46b38\") " pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:58 crc kubenswrapper[4856]: I0320 14:05:58.047786 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:05:58 crc kubenswrapper[4856]: I0320 14:05:58.550258 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xl9m"] Mar 20 14:05:59 crc kubenswrapper[4856]: I0320 14:05:59.338160 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e12430a-718f-4734-bfa0-1e4fd5d46b38" containerID="eabd2b4e03f3f242290a5ba09312a6e7b2c1365c422bfb0ae536a9afe3161935" exitCode=0 Mar 20 14:05:59 crc kubenswrapper[4856]: I0320 14:05:59.338236 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xl9m" event={"ID":"3e12430a-718f-4734-bfa0-1e4fd5d46b38","Type":"ContainerDied","Data":"eabd2b4e03f3f242290a5ba09312a6e7b2c1365c422bfb0ae536a9afe3161935"} Mar 20 14:05:59 crc kubenswrapper[4856]: I0320 14:05:59.339168 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xl9m" event={"ID":"3e12430a-718f-4734-bfa0-1e4fd5d46b38","Type":"ContainerStarted","Data":"0650119fd5713962b16729e1abe4fdadc7f11a5ed3af327973c8a62f4c4b7c3b"} Mar 20 14:05:59 crc kubenswrapper[4856]: I0320 14:05:59.821584 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:05:59 crc kubenswrapper[4856]: E0320 
14:05:59.822644 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.150231 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566926-tjnc8"] Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.151905 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.154832 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.154868 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.155786 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.160422 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-tjnc8"] Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.281851 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctfq\" (UniqueName: \"kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq\") pod \"auto-csr-approver-29566926-tjnc8\" (UID: \"4173d4f2-3326-4e76-af8c-ce72d8e22178\") " pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 
14:06:00.384089 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctfq\" (UniqueName: \"kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq\") pod \"auto-csr-approver-29566926-tjnc8\" (UID: \"4173d4f2-3326-4e76-af8c-ce72d8e22178\") " pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.405947 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctfq\" (UniqueName: \"kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq\") pod \"auto-csr-approver-29566926-tjnc8\" (UID: \"4173d4f2-3326-4e76-af8c-ce72d8e22178\") " pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.476946 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:00 crc kubenswrapper[4856]: I0320 14:06:00.891696 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-tjnc8"] Mar 20 14:06:01 crc kubenswrapper[4856]: I0320 14:06:01.367824 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" event={"ID":"4173d4f2-3326-4e76-af8c-ce72d8e22178","Type":"ContainerStarted","Data":"6564e284c20e67f01123a08b6e2dc7589c957966babeaa8e7ded4a4226583370"} Mar 20 14:06:03 crc kubenswrapper[4856]: I0320 14:06:03.396120 4856 generic.go:334] "Generic (PLEG): container finished" podID="3e12430a-718f-4734-bfa0-1e4fd5d46b38" containerID="969f62135912a9e551bd19ba57e7b196a7074c565b433fa7619d3d4b83c9e755" exitCode=0 Mar 20 14:06:03 crc kubenswrapper[4856]: I0320 14:06:03.396164 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xl9m" 
event={"ID":"3e12430a-718f-4734-bfa0-1e4fd5d46b38","Type":"ContainerDied","Data":"969f62135912a9e551bd19ba57e7b196a7074c565b433fa7619d3d4b83c9e755"} Mar 20 14:06:03 crc kubenswrapper[4856]: I0320 14:06:03.398354 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" event={"ID":"4173d4f2-3326-4e76-af8c-ce72d8e22178","Type":"ContainerStarted","Data":"15338ad9e9be6eca49c1e793099206030641b9cb4e4d2fe53c62a00fee7ac814"} Mar 20 14:06:03 crc kubenswrapper[4856]: I0320 14:06:03.438232 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" podStartSLOduration=1.426354457 podStartE2EDuration="3.438206554s" podCreationTimestamp="2026-03-20 14:06:00 +0000 UTC" firstStartedPulling="2026-03-20 14:06:00.903786091 +0000 UTC m=+2575.784812221" lastFinishedPulling="2026-03-20 14:06:02.915638158 +0000 UTC m=+2577.796664318" observedRunningTime="2026-03-20 14:06:03.432384102 +0000 UTC m=+2578.313410242" watchObservedRunningTime="2026-03-20 14:06:03.438206554 +0000 UTC m=+2578.319232684" Mar 20 14:06:04 crc kubenswrapper[4856]: I0320 14:06:04.408843 4856 generic.go:334] "Generic (PLEG): container finished" podID="4173d4f2-3326-4e76-af8c-ce72d8e22178" containerID="15338ad9e9be6eca49c1e793099206030641b9cb4e4d2fe53c62a00fee7ac814" exitCode=0 Mar 20 14:06:04 crc kubenswrapper[4856]: I0320 14:06:04.408906 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" event={"ID":"4173d4f2-3326-4e76-af8c-ce72d8e22178","Type":"ContainerDied","Data":"15338ad9e9be6eca49c1e793099206030641b9cb4e4d2fe53c62a00fee7ac814"} Mar 20 14:06:04 crc kubenswrapper[4856]: I0320 14:06:04.411456 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5xl9m" 
event={"ID":"3e12430a-718f-4734-bfa0-1e4fd5d46b38","Type":"ContainerStarted","Data":"b62a2866e9a4e6bc404ba69626744f96a4c63d3d897f636ba7e52cdc1d2a7e42"} Mar 20 14:06:04 crc kubenswrapper[4856]: I0320 14:06:04.453978 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5xl9m" podStartSLOduration=2.752731357 podStartE2EDuration="7.453958309s" podCreationTimestamp="2026-03-20 14:05:57 +0000 UTC" firstStartedPulling="2026-03-20 14:05:59.341129523 +0000 UTC m=+2574.222155693" lastFinishedPulling="2026-03-20 14:06:04.042356515 +0000 UTC m=+2578.923382645" observedRunningTime="2026-03-20 14:06:04.448074896 +0000 UTC m=+2579.329101036" watchObservedRunningTime="2026-03-20 14:06:04.453958309 +0000 UTC m=+2579.334984459" Mar 20 14:06:05 crc kubenswrapper[4856]: I0320 14:06:05.723142 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:05 crc kubenswrapper[4856]: I0320 14:06:05.862553 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ctfq\" (UniqueName: \"kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq\") pod \"4173d4f2-3326-4e76-af8c-ce72d8e22178\" (UID: \"4173d4f2-3326-4e76-af8c-ce72d8e22178\") " Mar 20 14:06:05 crc kubenswrapper[4856]: I0320 14:06:05.868479 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq" (OuterVolumeSpecName: "kube-api-access-4ctfq") pod "4173d4f2-3326-4e76-af8c-ce72d8e22178" (UID: "4173d4f2-3326-4e76-af8c-ce72d8e22178"). InnerVolumeSpecName "kube-api-access-4ctfq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:05 crc kubenswrapper[4856]: I0320 14:06:05.964786 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ctfq\" (UniqueName: \"kubernetes.io/projected/4173d4f2-3326-4e76-af8c-ce72d8e22178-kube-api-access-4ctfq\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:06 crc kubenswrapper[4856]: I0320 14:06:06.429620 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" event={"ID":"4173d4f2-3326-4e76-af8c-ce72d8e22178","Type":"ContainerDied","Data":"6564e284c20e67f01123a08b6e2dc7589c957966babeaa8e7ded4a4226583370"} Mar 20 14:06:06 crc kubenswrapper[4856]: I0320 14:06:06.429701 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-tjnc8" Mar 20 14:06:06 crc kubenswrapper[4856]: I0320 14:06:06.429682 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6564e284c20e67f01123a08b6e2dc7589c957966babeaa8e7ded4a4226583370" Mar 20 14:06:06 crc kubenswrapper[4856]: I0320 14:06:06.787545 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-jg6xl"] Mar 20 14:06:06 crc kubenswrapper[4856]: I0320 14:06:06.793972 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-jg6xl"] Mar 20 14:06:07 crc kubenswrapper[4856]: I0320 14:06:07.834761 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97624a1-8a30-4c93-8fc8-7995e78480ae" path="/var/lib/kubelet/pods/a97624a1-8a30-4c93-8fc8-7995e78480ae/volumes" Mar 20 14:06:08 crc kubenswrapper[4856]: I0320 14:06:08.048115 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:06:08 crc kubenswrapper[4856]: I0320 14:06:08.048571 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:06:08 crc kubenswrapper[4856]: I0320 14:06:08.109708 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:06:09 crc kubenswrapper[4856]: I0320 14:06:09.508154 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5xl9m" Mar 20 14:06:10 crc kubenswrapper[4856]: I0320 14:06:10.722115 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5xl9m"] Mar 20 14:06:10 crc kubenswrapper[4856]: I0320 14:06:10.868350 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 14:06:11 crc kubenswrapper[4856]: I0320 14:06:11.469130 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-x87hg" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="registry-server" containerID="cri-o://d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77" gracePeriod=2 Mar 20 14:06:11 crc kubenswrapper[4856]: I0320 14:06:11.879503 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.058720 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content\") pod \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.058798 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58n4g\" (UniqueName: \"kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g\") pod \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.058873 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities\") pod \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\" (UID: \"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357\") " Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.059407 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities" (OuterVolumeSpecName: "utilities") pod "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" (UID: "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.065628 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g" (OuterVolumeSpecName: "kube-api-access-58n4g") pod "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" (UID: "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357"). InnerVolumeSpecName "kube-api-access-58n4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.109644 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" (UID: "3dd9f3dc-ba0f-46c3-936d-44ff5adb2357"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.160101 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58n4g\" (UniqueName: \"kubernetes.io/projected/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-kube-api-access-58n4g\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.160145 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.160156 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.477572 4856 generic.go:334] "Generic (PLEG): container finished" podID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerID="d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77" exitCode=0 Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.477612 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerDied","Data":"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77"} Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.477667 4856 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-x87hg" event={"ID":"3dd9f3dc-ba0f-46c3-936d-44ff5adb2357","Type":"ContainerDied","Data":"c91a79769a729ced71109e017895bb7b6c28a86509f9cbe42a4c7386e57a0f4a"} Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.477690 4856 scope.go:117] "RemoveContainer" containerID="d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.477683 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-x87hg" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.496937 4856 scope.go:117] "RemoveContainer" containerID="e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.522329 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.532542 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-x87hg"] Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.532730 4856 scope.go:117] "RemoveContainer" containerID="95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.556278 4856 scope.go:117] "RemoveContainer" containerID="d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77" Mar 20 14:06:12 crc kubenswrapper[4856]: E0320 14:06:12.556678 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77\": container with ID starting with d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77 not found: ID does not exist" containerID="d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 
14:06:12.556711 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77"} err="failed to get container status \"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77\": rpc error: code = NotFound desc = could not find container \"d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77\": container with ID starting with d2596c487d246c1992e17402e1344504c1185b400ecbe58c93bafad93f13db77 not found: ID does not exist" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.556731 4856 scope.go:117] "RemoveContainer" containerID="e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa" Mar 20 14:06:12 crc kubenswrapper[4856]: E0320 14:06:12.557096 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa\": container with ID starting with e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa not found: ID does not exist" containerID="e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.557123 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa"} err="failed to get container status \"e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa\": rpc error: code = NotFound desc = could not find container \"e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa\": container with ID starting with e53ed2393dbcddc0a497b1b879bf575c7e4979afbe05e3a1c9e62835b2720efa not found: ID does not exist" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.557137 4856 scope.go:117] "RemoveContainer" containerID="95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9" Mar 20 14:06:12 crc 
kubenswrapper[4856]: E0320 14:06:12.557412 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9\": container with ID starting with 95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9 not found: ID does not exist" containerID="95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9" Mar 20 14:06:12 crc kubenswrapper[4856]: I0320 14:06:12.557432 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9"} err="failed to get container status \"95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9\": rpc error: code = NotFound desc = could not find container \"95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9\": container with ID starting with 95fa48aa8f0ea2ba6a7135f9c52a1141bd613bbb3e1851d45d76d5a5d596dfa9 not found: ID does not exist" Mar 20 14:06:13 crc kubenswrapper[4856]: I0320 14:06:13.825209 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:06:13 crc kubenswrapper[4856]: E0320 14:06:13.825484 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:06:13 crc kubenswrapper[4856]: I0320 14:06:13.832033 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" path="/var/lib/kubelet/pods/3dd9f3dc-ba0f-46c3-936d-44ff5adb2357/volumes" Mar 20 14:06:27 crc 
kubenswrapper[4856]: I0320 14:06:27.819758 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:06:27 crc kubenswrapper[4856]: E0320 14:06:27.820599 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:06:33 crc kubenswrapper[4856]: I0320 14:06:33.706228 4856 scope.go:117] "RemoveContainer" containerID="0f18f686dd1805405b7f3d9136d35be1af26594a8d887aeebd7dba8fc9bc1c78" Mar 20 14:06:38 crc kubenswrapper[4856]: I0320 14:06:38.820681 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:06:38 crc kubenswrapper[4856]: E0320 14:06:38.821632 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:06:53 crc kubenswrapper[4856]: I0320 14:06:53.820791 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:06:53 crc kubenswrapper[4856]: E0320 14:06:53.821804 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:07:08 crc kubenswrapper[4856]: I0320 14:07:08.820527 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:07:08 crc kubenswrapper[4856]: E0320 14:07:08.821541 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:07:19 crc kubenswrapper[4856]: I0320 14:07:19.819762 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:07:19 crc kubenswrapper[4856]: E0320 14:07:19.820850 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:07:34 crc kubenswrapper[4856]: I0320 14:07:34.819315 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:07:34 crc kubenswrapper[4856]: E0320 14:07:34.820143 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:07:47 crc kubenswrapper[4856]: I0320 14:07:47.819638 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:07:47 crc kubenswrapper[4856]: E0320 14:07:47.820489 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.142731 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566928-g6rkg"] Mar 20 14:08:00 crc kubenswrapper[4856]: E0320 14:08:00.143688 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143703 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4856]: E0320 14:08:00.143719 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4173d4f2-3326-4e76-af8c-ce72d8e22178" containerName="oc" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143725 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="4173d4f2-3326-4e76-af8c-ce72d8e22178" containerName="oc" Mar 20 14:08:00 crc kubenswrapper[4856]: E0320 14:08:00.143734 4856 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143741 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="extract-utilities" Mar 20 14:08:00 crc kubenswrapper[4856]: E0320 14:08:00.143748 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143754 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="extract-content" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143904 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dd9f3dc-ba0f-46c3-936d-44ff5adb2357" containerName="registry-server" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.143926 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="4173d4f2-3326-4e76-af8c-ce72d8e22178" containerName="oc" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.144417 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.146706 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.147006 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.147974 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.159711 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-g6rkg"] Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.268165 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dpll\" (UniqueName: \"kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll\") pod \"auto-csr-approver-29566928-g6rkg\" (UID: \"00c0ddb9-920c-4abc-8904-d93bb72e8f9e\") " pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.369552 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dpll\" (UniqueName: \"kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll\") pod \"auto-csr-approver-29566928-g6rkg\" (UID: \"00c0ddb9-920c-4abc-8904-d93bb72e8f9e\") " pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.395374 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dpll\" (UniqueName: \"kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll\") pod \"auto-csr-approver-29566928-g6rkg\" (UID: \"00c0ddb9-920c-4abc-8904-d93bb72e8f9e\") " 
pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.470590 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:00 crc kubenswrapper[4856]: I0320 14:08:00.704456 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-g6rkg"] Mar 20 14:08:01 crc kubenswrapper[4856]: I0320 14:08:01.341329 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" event={"ID":"00c0ddb9-920c-4abc-8904-d93bb72e8f9e","Type":"ContainerStarted","Data":"ca4a67c904f8f47e717b169d47722a58f65fd7a2071279670b6db06e17a05f01"} Mar 20 14:08:01 crc kubenswrapper[4856]: I0320 14:08:01.819777 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:08:01 crc kubenswrapper[4856]: E0320 14:08:01.820063 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:08:02 crc kubenswrapper[4856]: I0320 14:08:02.348822 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" event={"ID":"00c0ddb9-920c-4abc-8904-d93bb72e8f9e","Type":"ContainerStarted","Data":"13843ee995d50c5d17f1eb55b52bcc88437127b4bbdc60100253fe435dd90a6b"} Mar 20 14:08:03 crc kubenswrapper[4856]: I0320 14:08:03.356975 4856 generic.go:334] "Generic (PLEG): container finished" podID="00c0ddb9-920c-4abc-8904-d93bb72e8f9e" containerID="13843ee995d50c5d17f1eb55b52bcc88437127b4bbdc60100253fe435dd90a6b" 
exitCode=0 Mar 20 14:08:03 crc kubenswrapper[4856]: I0320 14:08:03.357012 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" event={"ID":"00c0ddb9-920c-4abc-8904-d93bb72e8f9e","Type":"ContainerDied","Data":"13843ee995d50c5d17f1eb55b52bcc88437127b4bbdc60100253fe435dd90a6b"} Mar 20 14:08:04 crc kubenswrapper[4856]: I0320 14:08:04.668015 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:04 crc kubenswrapper[4856]: I0320 14:08:04.837443 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dpll\" (UniqueName: \"kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll\") pod \"00c0ddb9-920c-4abc-8904-d93bb72e8f9e\" (UID: \"00c0ddb9-920c-4abc-8904-d93bb72e8f9e\") " Mar 20 14:08:04 crc kubenswrapper[4856]: I0320 14:08:04.844340 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll" (OuterVolumeSpecName: "kube-api-access-9dpll") pod "00c0ddb9-920c-4abc-8904-d93bb72e8f9e" (UID: "00c0ddb9-920c-4abc-8904-d93bb72e8f9e"). InnerVolumeSpecName "kube-api-access-9dpll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:08:04 crc kubenswrapper[4856]: I0320 14:08:04.939045 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dpll\" (UniqueName: \"kubernetes.io/projected/00c0ddb9-920c-4abc-8904-d93bb72e8f9e-kube-api-access-9dpll\") on node \"crc\" DevicePath \"\"" Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.376653 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" event={"ID":"00c0ddb9-920c-4abc-8904-d93bb72e8f9e","Type":"ContainerDied","Data":"ca4a67c904f8f47e717b169d47722a58f65fd7a2071279670b6db06e17a05f01"} Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.376711 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca4a67c904f8f47e717b169d47722a58f65fd7a2071279670b6db06e17a05f01" Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.376738 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-g6rkg" Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.435612 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjgmm"] Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.442734 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjgmm"] Mar 20 14:08:05 crc kubenswrapper[4856]: I0320 14:08:05.841529 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8688ef36-c016-4451-aed2-9da03340c237" path="/var/lib/kubelet/pods/8688ef36-c016-4451-aed2-9da03340c237/volumes" Mar 20 14:08:14 crc kubenswrapper[4856]: I0320 14:08:14.820546 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:08:14 crc kubenswrapper[4856]: E0320 14:08:14.823199 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:08:29 crc kubenswrapper[4856]: I0320 14:08:29.819636 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:08:29 crc kubenswrapper[4856]: E0320 14:08:29.820631 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:08:33 crc kubenswrapper[4856]: I0320 14:08:33.830475 4856 scope.go:117] "RemoveContainer" containerID="f16bdd978667e6efde3773e3e6d222aaa3f2ffb28961a49ec5830793c4180b65" Mar 20 14:08:42 crc kubenswrapper[4856]: I0320 14:08:42.820024 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:08:42 crc kubenswrapper[4856]: E0320 14:08:42.821107 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:08:57 crc kubenswrapper[4856]: I0320 14:08:57.820515 4856 scope.go:117] "RemoveContainer" 
containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:08:57 crc kubenswrapper[4856]: E0320 14:08:57.821324 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:09:08 crc kubenswrapper[4856]: I0320 14:09:08.820857 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:09:08 crc kubenswrapper[4856]: E0320 14:09:08.821698 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:09:23 crc kubenswrapper[4856]: I0320 14:09:23.819812 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:09:23 crc kubenswrapper[4856]: E0320 14:09:23.820877 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.291260 4856 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:31 crc kubenswrapper[4856]: E0320 14:09:31.294343 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00c0ddb9-920c-4abc-8904-d93bb72e8f9e" containerName="oc" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.294465 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="00c0ddb9-920c-4abc-8904-d93bb72e8f9e" containerName="oc" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.294797 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="00c0ddb9-920c-4abc-8904-d93bb72e8f9e" containerName="oc" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.296020 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.311152 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.429465 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gwhg\" (UniqueName: \"kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.429545 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.429827 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.531526 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gwhg\" (UniqueName: \"kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.531626 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.531726 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.532335 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.532507 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.567408 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gwhg\" (UniqueName: \"kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg\") pod \"redhat-operators-4xjj9\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:31 crc kubenswrapper[4856]: I0320 14:09:31.617759 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:32 crc kubenswrapper[4856]: I0320 14:09:32.124829 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:33 crc kubenswrapper[4856]: I0320 14:09:33.057037 4856 generic.go:334] "Generic (PLEG): container finished" podID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerID="33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa" exitCode=0 Mar 20 14:09:33 crc kubenswrapper[4856]: I0320 14:09:33.057090 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerDied","Data":"33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa"} Mar 20 14:09:33 crc kubenswrapper[4856]: I0320 14:09:33.057447 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerStarted","Data":"b30e320cee1f3293ab41237bb556805c7f45956af734a7b329efe3dfab1fe1b3"} Mar 20 14:09:35 crc kubenswrapper[4856]: I0320 14:09:35.073847 4856 generic.go:334] "Generic (PLEG): container finished" 
podID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerID="d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28" exitCode=0 Mar 20 14:09:35 crc kubenswrapper[4856]: I0320 14:09:35.074464 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerDied","Data":"d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28"} Mar 20 14:09:35 crc kubenswrapper[4856]: I0320 14:09:35.824793 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:09:35 crc kubenswrapper[4856]: E0320 14:09:35.825462 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:09:36 crc kubenswrapper[4856]: I0320 14:09:36.087508 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerStarted","Data":"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b"} Mar 20 14:09:36 crc kubenswrapper[4856]: I0320 14:09:36.119848 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-4xjj9" podStartSLOduration=2.696902528 podStartE2EDuration="5.119825016s" podCreationTimestamp="2026-03-20 14:09:31 +0000 UTC" firstStartedPulling="2026-03-20 14:09:33.05971164 +0000 UTC m=+2787.940737800" lastFinishedPulling="2026-03-20 14:09:35.482634148 +0000 UTC m=+2790.363660288" observedRunningTime="2026-03-20 14:09:36.115916757 +0000 UTC m=+2790.996942917" 
watchObservedRunningTime="2026-03-20 14:09:36.119825016 +0000 UTC m=+2791.000851156" Mar 20 14:09:41 crc kubenswrapper[4856]: I0320 14:09:41.617927 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:41 crc kubenswrapper[4856]: I0320 14:09:41.618733 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:42 crc kubenswrapper[4856]: I0320 14:09:42.664354 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-4xjj9" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="registry-server" probeResult="failure" output=< Mar 20 14:09:42 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 14:09:42 crc kubenswrapper[4856]: > Mar 20 14:09:50 crc kubenswrapper[4856]: I0320 14:09:50.820108 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:09:51 crc kubenswrapper[4856]: I0320 14:09:51.196855 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b"} Mar 20 14:09:51 crc kubenswrapper[4856]: I0320 14:09:51.669775 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:51 crc kubenswrapper[4856]: I0320 14:09:51.717036 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:51 crc kubenswrapper[4856]: I0320 14:09:51.908890 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.212690 
4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-4xjj9" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="registry-server" containerID="cri-o://ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b" gracePeriod=2 Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.573246 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.668170 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gwhg\" (UniqueName: \"kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg\") pod \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.668217 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities\") pod \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.668303 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content\") pod \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\" (UID: \"f3a569cf-8207-4070-bb30-b2c4eeb7be40\") " Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.669113 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities" (OuterVolumeSpecName: "utilities") pod "f3a569cf-8207-4070-bb30-b2c4eeb7be40" (UID: "f3a569cf-8207-4070-bb30-b2c4eeb7be40"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.674900 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg" (OuterVolumeSpecName: "kube-api-access-7gwhg") pod "f3a569cf-8207-4070-bb30-b2c4eeb7be40" (UID: "f3a569cf-8207-4070-bb30-b2c4eeb7be40"). InnerVolumeSpecName "kube-api-access-7gwhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.770560 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gwhg\" (UniqueName: \"kubernetes.io/projected/f3a569cf-8207-4070-bb30-b2c4eeb7be40-kube-api-access-7gwhg\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.770604 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.799507 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f3a569cf-8207-4070-bb30-b2c4eeb7be40" (UID: "f3a569cf-8207-4070-bb30-b2c4eeb7be40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4856]: I0320 14:09:53.871624 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f3a569cf-8207-4070-bb30-b2c4eeb7be40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:54 crc kubenswrapper[4856]: E0320 14:09:54.003403 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a569cf_8207_4070_bb30_b2c4eeb7be40.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a569cf_8207_4070_bb30_b2c4eeb7be40.slice/crio-b30e320cee1f3293ab41237bb556805c7f45956af734a7b329efe3dfab1fe1b3\": RecentStats: unable to find data in memory cache]" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.220599 4856 generic.go:334] "Generic (PLEG): container finished" podID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerID="ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b" exitCode=0 Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.220650 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerDied","Data":"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b"} Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.220948 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-4xjj9" event={"ID":"f3a569cf-8207-4070-bb30-b2c4eeb7be40","Type":"ContainerDied","Data":"b30e320cee1f3293ab41237bb556805c7f45956af734a7b329efe3dfab1fe1b3"} Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.220977 4856 scope.go:117] "RemoveContainer" containerID="ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b" Mar 20 14:09:54 crc 
kubenswrapper[4856]: I0320 14:09:54.220680 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-4xjj9" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.245576 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.248098 4856 scope.go:117] "RemoveContainer" containerID="d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.253302 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-4xjj9"] Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.274479 4856 scope.go:117] "RemoveContainer" containerID="33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.292569 4856 scope.go:117] "RemoveContainer" containerID="ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b" Mar 20 14:09:54 crc kubenswrapper[4856]: E0320 14:09:54.292943 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b\": container with ID starting with ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b not found: ID does not exist" containerID="ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.292987 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b"} err="failed to get container status \"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b\": rpc error: code = NotFound desc = could not find container \"ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b\": container with 
ID starting with ae99582e691cd027d9ec805f5aef4959b72adeacd7779e1b744dd356fcc67b1b not found: ID does not exist" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.293022 4856 scope.go:117] "RemoveContainer" containerID="d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28" Mar 20 14:09:54 crc kubenswrapper[4856]: E0320 14:09:54.293548 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28\": container with ID starting with d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28 not found: ID does not exist" containerID="d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.293576 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28"} err="failed to get container status \"d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28\": rpc error: code = NotFound desc = could not find container \"d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28\": container with ID starting with d48db9a426656065ea719d03f361b64b566b786395ed019763f76bc9202d2c28 not found: ID does not exist" Mar 20 14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.293599 4856 scope.go:117] "RemoveContainer" containerID="33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa" Mar 20 14:09:54 crc kubenswrapper[4856]: E0320 14:09:54.293865 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa\": container with ID starting with 33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa not found: ID does not exist" containerID="33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa" Mar 20 
14:09:54 crc kubenswrapper[4856]: I0320 14:09:54.293908 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa"} err="failed to get container status \"33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa\": rpc error: code = NotFound desc = could not find container \"33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa\": container with ID starting with 33ee7519f7468ab8f89d157a0d72d8a24335891b12de31ed480b4edc68b45aaa not found: ID does not exist" Mar 20 14:09:55 crc kubenswrapper[4856]: I0320 14:09:55.834462 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" path="/var/lib/kubelet/pods/f3a569cf-8207-4070-bb30-b2c4eeb7be40/volumes" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.139441 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566930-srhjx"] Mar 20 14:10:00 crc kubenswrapper[4856]: E0320 14:10:00.143504 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.143525 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4856]: E0320 14:10:00.143539 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.143547 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4856]: E0320 14:10:00.143564 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" 
containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.143572 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.143739 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a569cf-8207-4070-bb30-b2c4eeb7be40" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.144338 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.146903 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.148309 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.148651 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.158070 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-srhjx"] Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.262372 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr529\" (UniqueName: \"kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529\") pod \"auto-csr-approver-29566930-srhjx\" (UID: \"f3a798da-a318-4bff-a41d-8e35dff4b66e\") " pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.363952 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr529\" (UniqueName: 
\"kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529\") pod \"auto-csr-approver-29566930-srhjx\" (UID: \"f3a798da-a318-4bff-a41d-8e35dff4b66e\") " pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.384982 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr529\" (UniqueName: \"kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529\") pod \"auto-csr-approver-29566930-srhjx\" (UID: \"f3a798da-a318-4bff-a41d-8e35dff4b66e\") " pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.491645 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:00 crc kubenswrapper[4856]: I0320 14:10:00.906537 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-srhjx"] Mar 20 14:10:00 crc kubenswrapper[4856]: W0320 14:10:00.907461 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3a798da_a318_4bff_a41d_8e35dff4b66e.slice/crio-96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf WatchSource:0}: Error finding container 96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf: Status 404 returned error can't find the container with id 96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf Mar 20 14:10:01 crc kubenswrapper[4856]: I0320 14:10:01.286580 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-srhjx" event={"ID":"f3a798da-a318-4bff-a41d-8e35dff4b66e","Type":"ContainerStarted","Data":"96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf"} Mar 20 14:10:03 crc kubenswrapper[4856]: I0320 14:10:03.302678 4856 generic.go:334] "Generic (PLEG): container finished" 
podID="f3a798da-a318-4bff-a41d-8e35dff4b66e" containerID="6de0045c9d42ca9c4e1ee154c89fdbaef2b05bd68a67e4849218b4939c42eeb7" exitCode=0 Mar 20 14:10:03 crc kubenswrapper[4856]: I0320 14:10:03.302744 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-srhjx" event={"ID":"f3a798da-a318-4bff-a41d-8e35dff4b66e","Type":"ContainerDied","Data":"6de0045c9d42ca9c4e1ee154c89fdbaef2b05bd68a67e4849218b4939c42eeb7"} Mar 20 14:10:04 crc kubenswrapper[4856]: I0320 14:10:04.592709 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:04 crc kubenswrapper[4856]: I0320 14:10:04.725185 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr529\" (UniqueName: \"kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529\") pod \"f3a798da-a318-4bff-a41d-8e35dff4b66e\" (UID: \"f3a798da-a318-4bff-a41d-8e35dff4b66e\") " Mar 20 14:10:04 crc kubenswrapper[4856]: I0320 14:10:04.730528 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529" (OuterVolumeSpecName: "kube-api-access-zr529") pod "f3a798da-a318-4bff-a41d-8e35dff4b66e" (UID: "f3a798da-a318-4bff-a41d-8e35dff4b66e"). InnerVolumeSpecName "kube-api-access-zr529". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:10:04 crc kubenswrapper[4856]: I0320 14:10:04.826648 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr529\" (UniqueName: \"kubernetes.io/projected/f3a798da-a318-4bff-a41d-8e35dff4b66e-kube-api-access-zr529\") on node \"crc\" DevicePath \"\"" Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.318467 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-srhjx" event={"ID":"f3a798da-a318-4bff-a41d-8e35dff4b66e","Type":"ContainerDied","Data":"96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf"} Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.318840 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96f207c2a58b9087c36dc730cdc4afcd4c56ca5a605113b5d1bbc1977c02edaf" Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.318538 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-srhjx" Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.681184 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t44zb"] Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.688064 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t44zb"] Mar 20 14:10:05 crc kubenswrapper[4856]: I0320 14:10:05.829294 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da8dc781-e6f0-4351-9515-eaf71c3d85ef" path="/var/lib/kubelet/pods/da8dc781-e6f0-4351-9515-eaf71c3d85ef/volumes" Mar 20 14:10:33 crc kubenswrapper[4856]: I0320 14:10:33.926710 4856 scope.go:117] "RemoveContainer" containerID="056de529056419c4ece5d91e19853803cc0e5bd8baeb1a012d6d9d58f8fd45ff" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.148287 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566932-p9bz8"] Mar 20 14:12:00 crc kubenswrapper[4856]: E0320 14:12:00.149209 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3a798da-a318-4bff-a41d-8e35dff4b66e" containerName="oc" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.149223 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3a798da-a318-4bff-a41d-8e35dff4b66e" containerName="oc" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.149860 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3a798da-a318-4bff-a41d-8e35dff4b66e" containerName="oc" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.150480 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.155988 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.156130 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.156286 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.173633 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-p9bz8"] Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.320127 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwjpz\" (UniqueName: \"kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz\") pod \"auto-csr-approver-29566932-p9bz8\" (UID: \"98b813be-bb3c-487f-afa0-98960c34bd31\") " pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 
14:12:00.421419 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwjpz\" (UniqueName: \"kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz\") pod \"auto-csr-approver-29566932-p9bz8\" (UID: \"98b813be-bb3c-487f-afa0-98960c34bd31\") " pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.457392 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwjpz\" (UniqueName: \"kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz\") pod \"auto-csr-approver-29566932-p9bz8\" (UID: \"98b813be-bb3c-487f-afa0-98960c34bd31\") " pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.475959 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.908252 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-p9bz8"] Mar 20 14:12:00 crc kubenswrapper[4856]: I0320 14:12:00.924459 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:12:01 crc kubenswrapper[4856]: I0320 14:12:01.522464 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" event={"ID":"98b813be-bb3c-487f-afa0-98960c34bd31","Type":"ContainerStarted","Data":"e8c3b4d7addaab3fa1e967463454bce81f3107c0f6ebdb1530f02e8ecc92e6fb"} Mar 20 14:12:03 crc kubenswrapper[4856]: I0320 14:12:03.540715 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" event={"ID":"98b813be-bb3c-487f-afa0-98960c34bd31","Type":"ContainerStarted","Data":"ecff83741f67618729e6e444a415da8f8b855eeb4fb52f86aad147b9e77ff992"} Mar 20 14:12:03 crc 
kubenswrapper[4856]: I0320 14:12:03.554422 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" podStartSLOduration=1.253517587 podStartE2EDuration="3.554410047s" podCreationTimestamp="2026-03-20 14:12:00 +0000 UTC" firstStartedPulling="2026-03-20 14:12:00.922254781 +0000 UTC m=+2935.803280961" lastFinishedPulling="2026-03-20 14:12:03.223147281 +0000 UTC m=+2938.104173421" observedRunningTime="2026-03-20 14:12:03.553154613 +0000 UTC m=+2938.434180743" watchObservedRunningTime="2026-03-20 14:12:03.554410047 +0000 UTC m=+2938.435436177" Mar 20 14:12:04 crc kubenswrapper[4856]: I0320 14:12:04.547258 4856 generic.go:334] "Generic (PLEG): container finished" podID="98b813be-bb3c-487f-afa0-98960c34bd31" containerID="ecff83741f67618729e6e444a415da8f8b855eeb4fb52f86aad147b9e77ff992" exitCode=0 Mar 20 14:12:04 crc kubenswrapper[4856]: I0320 14:12:04.547432 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" event={"ID":"98b813be-bb3c-487f-afa0-98960c34bd31","Type":"ContainerDied","Data":"ecff83741f67618729e6e444a415da8f8b855eeb4fb52f86aad147b9e77ff992"} Mar 20 14:12:05 crc kubenswrapper[4856]: I0320 14:12:05.865543 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:05 crc kubenswrapper[4856]: I0320 14:12:05.996969 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwjpz\" (UniqueName: \"kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz\") pod \"98b813be-bb3c-487f-afa0-98960c34bd31\" (UID: \"98b813be-bb3c-487f-afa0-98960c34bd31\") " Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.002739 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz" (OuterVolumeSpecName: "kube-api-access-xwjpz") pod "98b813be-bb3c-487f-afa0-98960c34bd31" (UID: "98b813be-bb3c-487f-afa0-98960c34bd31"). InnerVolumeSpecName "kube-api-access-xwjpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.098636 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwjpz\" (UniqueName: \"kubernetes.io/projected/98b813be-bb3c-487f-afa0-98960c34bd31-kube-api-access-xwjpz\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.566081 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" event={"ID":"98b813be-bb3c-487f-afa0-98960c34bd31","Type":"ContainerDied","Data":"e8c3b4d7addaab3fa1e967463454bce81f3107c0f6ebdb1530f02e8ecc92e6fb"} Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.566140 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8c3b4d7addaab3fa1e967463454bce81f3107c0f6ebdb1530f02e8ecc92e6fb" Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.566305 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-p9bz8" Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.651363 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-tjnc8"] Mar 20 14:12:06 crc kubenswrapper[4856]: I0320 14:12:06.658584 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-tjnc8"] Mar 20 14:12:06 crc kubenswrapper[4856]: E0320 14:12:06.754963 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b813be_bb3c_487f_afa0_98960c34bd31.slice/crio-e8c3b4d7addaab3fa1e967463454bce81f3107c0f6ebdb1530f02e8ecc92e6fb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98b813be_bb3c_487f_afa0_98960c34bd31.slice\": RecentStats: unable to find data in memory cache]" Mar 20 14:12:07 crc kubenswrapper[4856]: I0320 14:12:07.837366 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4173d4f2-3326-4e76-af8c-ce72d8e22178" path="/var/lib/kubelet/pods/4173d4f2-3326-4e76-af8c-ce72d8e22178/volumes" Mar 20 14:12:09 crc kubenswrapper[4856]: I0320 14:12:09.987556 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:12:09 crc kubenswrapper[4856]: I0320 14:12:09.987901 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Mar 20 14:12:34 crc kubenswrapper[4856]: I0320 14:12:34.044829 4856 scope.go:117] "RemoveContainer" containerID="15338ad9e9be6eca49c1e793099206030641b9cb4e4d2fe53c62a00fee7ac814" Mar 20 14:12:39 crc kubenswrapper[4856]: I0320 14:12:39.987036 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:12:39 crc kubenswrapper[4856]: I0320 14:12:39.987501 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:13:09 crc kubenswrapper[4856]: I0320 14:13:09.987336 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:13:09 crc kubenswrapper[4856]: I0320 14:13:09.987887 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:13:09 crc kubenswrapper[4856]: I0320 14:13:09.987934 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:13:09 crc kubenswrapper[4856]: I0320 14:13:09.988597 4856 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:13:09 crc kubenswrapper[4856]: I0320 14:13:09.988653 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b" gracePeriod=600 Mar 20 14:13:10 crc kubenswrapper[4856]: I0320 14:13:10.133373 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b" exitCode=0 Mar 20 14:13:10 crc kubenswrapper[4856]: I0320 14:13:10.133589 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b"} Mar 20 14:13:10 crc kubenswrapper[4856]: I0320 14:13:10.133798 4856 scope.go:117] "RemoveContainer" containerID="39dec2fbe969130c78bb1ee495e72c1a4339ef02b3439e7f994f33cbf7a4db06" Mar 20 14:13:11 crc kubenswrapper[4856]: I0320 14:13:11.152153 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"} Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.592946 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 
14:13:46 crc kubenswrapper[4856]: E0320 14:13:46.594458 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98b813be-bb3c-487f-afa0-98960c34bd31" containerName="oc" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.594491 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="98b813be-bb3c-487f-afa0-98960c34bd31" containerName="oc" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.595029 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="98b813be-bb3c-487f-afa0-98960c34bd31" containerName="oc" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.602199 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.608301 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.763289 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.763349 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m86f\" (UniqueName: \"kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.763400 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.864332 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.864461 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.864511 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m86f\" (UniqueName: \"kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.864962 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.865019 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.883030 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m86f\" (UniqueName: \"kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f\") pod \"redhat-marketplace-ph6vz\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:46 crc kubenswrapper[4856]: I0320 14:13:46.927338 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:47 crc kubenswrapper[4856]: I0320 14:13:47.182582 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 14:13:47 crc kubenswrapper[4856]: I0320 14:13:47.440736 4856 generic.go:334] "Generic (PLEG): container finished" podID="0fffca2a-4895-4334-a891-4e2da95c8121" containerID="13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc" exitCode=0 Mar 20 14:13:47 crc kubenswrapper[4856]: I0320 14:13:47.440798 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerDied","Data":"13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc"} Mar 20 14:13:47 crc kubenswrapper[4856]: I0320 14:13:47.441173 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerStarted","Data":"4f68f3c30cb348ecc4227799b509bc2a35beefb034eb2e803fc001e65b9b6ec1"} Mar 20 14:13:49 crc kubenswrapper[4856]: I0320 14:13:49.459335 4856 generic.go:334] "Generic (PLEG): container 
finished" podID="0fffca2a-4895-4334-a891-4e2da95c8121" containerID="c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22" exitCode=0 Mar 20 14:13:49 crc kubenswrapper[4856]: I0320 14:13:49.459526 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerDied","Data":"c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22"} Mar 20 14:13:50 crc kubenswrapper[4856]: I0320 14:13:50.468128 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerStarted","Data":"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352"} Mar 20 14:13:50 crc kubenswrapper[4856]: I0320 14:13:50.488799 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ph6vz" podStartSLOduration=1.9432295050000001 podStartE2EDuration="4.488774136s" podCreationTimestamp="2026-03-20 14:13:46 +0000 UTC" firstStartedPulling="2026-03-20 14:13:47.443138544 +0000 UTC m=+3042.324164684" lastFinishedPulling="2026-03-20 14:13:49.988683175 +0000 UTC m=+3044.869709315" observedRunningTime="2026-03-20 14:13:50.484302054 +0000 UTC m=+3045.365328194" watchObservedRunningTime="2026-03-20 14:13:50.488774136 +0000 UTC m=+3045.369800266" Mar 20 14:13:56 crc kubenswrapper[4856]: I0320 14:13:56.928196 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:56 crc kubenswrapper[4856]: I0320 14:13:56.928759 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:56 crc kubenswrapper[4856]: I0320 14:13:56.977650 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 
20 14:13:57 crc kubenswrapper[4856]: I0320 14:13:57.606404 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:13:57 crc kubenswrapper[4856]: I0320 14:13:57.661455 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 14:13:59 crc kubenswrapper[4856]: I0320 14:13:59.543709 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ph6vz" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="registry-server" containerID="cri-o://3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352" gracePeriod=2 Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.002768 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.107615 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content\") pod \"0fffca2a-4895-4334-a891-4e2da95c8121\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.107687 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities\") pod \"0fffca2a-4895-4334-a891-4e2da95c8121\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.107804 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m86f\" (UniqueName: \"kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f\") pod \"0fffca2a-4895-4334-a891-4e2da95c8121\" (UID: \"0fffca2a-4895-4334-a891-4e2da95c8121\") " Mar 
20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.108682 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities" (OuterVolumeSpecName: "utilities") pod "0fffca2a-4895-4334-a891-4e2da95c8121" (UID: "0fffca2a-4895-4334-a891-4e2da95c8121"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.114716 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f" (OuterVolumeSpecName: "kube-api-access-4m86f") pod "0fffca2a-4895-4334-a891-4e2da95c8121" (UID: "0fffca2a-4895-4334-a891-4e2da95c8121"). InnerVolumeSpecName "kube-api-access-4m86f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.137958 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fffca2a-4895-4334-a891-4e2da95c8121" (UID: "0fffca2a-4895-4334-a891-4e2da95c8121"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.156219 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566934-pqnwv"] Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.156607 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="registry-server" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.156628 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="registry-server" Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.156651 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="extract-content" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.156658 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="extract-content" Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.156673 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="extract-utilities" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.156680 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="extract-utilities" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.156796 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" containerName="registry-server" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.157215 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.161918 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.162259 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.162572 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.165585 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-pqnwv"] Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.209405 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.209561 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fffca2a-4895-4334-a891-4e2da95c8121-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.209587 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m86f\" (UniqueName: \"kubernetes.io/projected/0fffca2a-4895-4334-a891-4e2da95c8121-kube-api-access-4m86f\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.311533 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7kj\" (UniqueName: \"kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj\") pod \"auto-csr-approver-29566934-pqnwv\" (UID: \"dd5b1f2d-7e51-4c02-8222-36edd975c8a6\") " 
pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.413925 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7kj\" (UniqueName: \"kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj\") pod \"auto-csr-approver-29566934-pqnwv\" (UID: \"dd5b1f2d-7e51-4c02-8222-36edd975c8a6\") " pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.437641 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7kj\" (UniqueName: \"kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj\") pod \"auto-csr-approver-29566934-pqnwv\" (UID: \"dd5b1f2d-7e51-4c02-8222-36edd975c8a6\") " pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.482657 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.558517 4856 generic.go:334] "Generic (PLEG): container finished" podID="0fffca2a-4895-4334-a891-4e2da95c8121" containerID="3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352" exitCode=0 Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.558580 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerDied","Data":"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352"} Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.558625 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ph6vz" event={"ID":"0fffca2a-4895-4334-a891-4e2da95c8121","Type":"ContainerDied","Data":"4f68f3c30cb348ecc4227799b509bc2a35beefb034eb2e803fc001e65b9b6ec1"} Mar 20 14:14:00 crc 
kubenswrapper[4856]: I0320 14:14:00.558664 4856 scope.go:117] "RemoveContainer" containerID="3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.558853 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ph6vz" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.579174 4856 scope.go:117] "RemoveContainer" containerID="c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.627950 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.633206 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ph6vz"] Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.635064 4856 scope.go:117] "RemoveContainer" containerID="13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.652557 4856 scope.go:117] "RemoveContainer" containerID="3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352" Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.652966 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352\": container with ID starting with 3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352 not found: ID does not exist" containerID="3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.653035 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352"} err="failed to get container status 
\"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352\": rpc error: code = NotFound desc = could not find container \"3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352\": container with ID starting with 3f3b15fd860de6ffd8c0ff3dcb603020ea04445751a37e77ef3917c72144c352 not found: ID does not exist" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.653075 4856 scope.go:117] "RemoveContainer" containerID="c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22" Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.653400 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22\": container with ID starting with c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22 not found: ID does not exist" containerID="c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.653435 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22"} err="failed to get container status \"c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22\": rpc error: code = NotFound desc = could not find container \"c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22\": container with ID starting with c3c22df7ea605d0d4b990d4132c6bc5b27376e0fcfc5538cfc6871a3c9eddf22 not found: ID does not exist" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.653457 4856 scope.go:117] "RemoveContainer" containerID="13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc" Mar 20 14:14:00 crc kubenswrapper[4856]: E0320 14:14:00.653721 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc\": container with ID starting with 13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc not found: ID does not exist" containerID="13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.653761 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc"} err="failed to get container status \"13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc\": rpc error: code = NotFound desc = could not find container \"13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc\": container with ID starting with 13511ba7072cb931885f53d593408d3dc1bd644ac0a0c3b9abffd879e5f596bc not found: ID does not exist" Mar 20 14:14:00 crc kubenswrapper[4856]: I0320 14:14:00.938142 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-pqnwv"] Mar 20 14:14:01 crc kubenswrapper[4856]: I0320 14:14:01.571057 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" event={"ID":"dd5b1f2d-7e51-4c02-8222-36edd975c8a6","Type":"ContainerStarted","Data":"77baf2970dfa32d1ec1cccd9b4eb30fe229e2e45d57be3f8a992cd22b927a212"} Mar 20 14:14:01 crc kubenswrapper[4856]: I0320 14:14:01.834604 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fffca2a-4895-4334-a891-4e2da95c8121" path="/var/lib/kubelet/pods/0fffca2a-4895-4334-a891-4e2da95c8121/volumes" Mar 20 14:14:02 crc kubenswrapper[4856]: I0320 14:14:02.585377 4856 generic.go:334] "Generic (PLEG): container finished" podID="dd5b1f2d-7e51-4c02-8222-36edd975c8a6" containerID="5fa5e93bb54791981d5ea566e8d13383ad5ff72df8a6e713ecc1214a81a1a247" exitCode=0 Mar 20 14:14:02 crc kubenswrapper[4856]: I0320 14:14:02.585459 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566934-pqnwv" event={"ID":"dd5b1f2d-7e51-4c02-8222-36edd975c8a6","Type":"ContainerDied","Data":"5fa5e93bb54791981d5ea566e8d13383ad5ff72df8a6e713ecc1214a81a1a247"} Mar 20 14:14:03 crc kubenswrapper[4856]: I0320 14:14:03.899559 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.065724 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7kj\" (UniqueName: \"kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj\") pod \"dd5b1f2d-7e51-4c02-8222-36edd975c8a6\" (UID: \"dd5b1f2d-7e51-4c02-8222-36edd975c8a6\") " Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.071650 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj" (OuterVolumeSpecName: "kube-api-access-2g7kj") pod "dd5b1f2d-7e51-4c02-8222-36edd975c8a6" (UID: "dd5b1f2d-7e51-4c02-8222-36edd975c8a6"). InnerVolumeSpecName "kube-api-access-2g7kj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.166986 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7kj\" (UniqueName: \"kubernetes.io/projected/dd5b1f2d-7e51-4c02-8222-36edd975c8a6-kube-api-access-2g7kj\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.620770 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" event={"ID":"dd5b1f2d-7e51-4c02-8222-36edd975c8a6","Type":"ContainerDied","Data":"77baf2970dfa32d1ec1cccd9b4eb30fe229e2e45d57be3f8a992cd22b927a212"} Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.620839 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77baf2970dfa32d1ec1cccd9b4eb30fe229e2e45d57be3f8a992cd22b927a212" Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.620875 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-pqnwv" Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.969488 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-g6rkg"] Mar 20 14:14:04 crc kubenswrapper[4856]: I0320 14:14:04.976577 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-g6rkg"] Mar 20 14:14:05 crc kubenswrapper[4856]: I0320 14:14:05.830441 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00c0ddb9-920c-4abc-8904-d93bb72e8f9e" path="/var/lib/kubelet/pods/00c0ddb9-920c-4abc-8904-d93bb72e8f9e/volumes" Mar 20 14:14:34 crc kubenswrapper[4856]: I0320 14:14:34.159939 4856 scope.go:117] "RemoveContainer" containerID="13843ee995d50c5d17f1eb55b52bcc88437127b4bbdc60100253fe435dd90a6b" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.152855 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74"] Mar 20 14:15:00 crc kubenswrapper[4856]: E0320 14:15:00.153647 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd5b1f2d-7e51-4c02-8222-36edd975c8a6" containerName="oc" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.153661 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd5b1f2d-7e51-4c02-8222-36edd975c8a6" containerName="oc" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.153831 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd5b1f2d-7e51-4c02-8222-36edd975c8a6" containerName="oc" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.154389 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.156610 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.156831 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.161183 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74"] Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.345535 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.345612 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tmdl\" (UniqueName: \"kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.345639 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.447324 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.447398 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tmdl\" (UniqueName: \"kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.447425 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.448774 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.457936 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.462915 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tmdl\" (UniqueName: \"kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl\") pod \"collect-profiles-29566935-vpf74\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.474704 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:00 crc kubenswrapper[4856]: I0320 14:15:00.943234 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74"] Mar 20 14:15:01 crc kubenswrapper[4856]: I0320 14:15:01.133372 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" event={"ID":"08470efe-e57d-4003-9148-be1539c7a004","Type":"ContainerStarted","Data":"269eb72aac38c2a1b185f52d341310a83666dd54d1d1a77ed54ccbb649933e99"} Mar 20 14:15:01 crc kubenswrapper[4856]: I0320 14:15:01.133675 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" event={"ID":"08470efe-e57d-4003-9148-be1539c7a004","Type":"ContainerStarted","Data":"20101387cc66ee83d5f6e114b0d50b14271d4c80a2686261a47848557ae46358"} Mar 20 14:15:01 crc kubenswrapper[4856]: I0320 14:15:01.150298 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" podStartSLOduration=1.15025904 podStartE2EDuration="1.15025904s" podCreationTimestamp="2026-03-20 14:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:15:01.14948867 +0000 UTC m=+3116.030514800" watchObservedRunningTime="2026-03-20 14:15:01.15025904 +0000 UTC m=+3116.031285170" Mar 20 14:15:02 crc kubenswrapper[4856]: I0320 14:15:02.157220 4856 generic.go:334] "Generic (PLEG): container finished" podID="08470efe-e57d-4003-9148-be1539c7a004" containerID="269eb72aac38c2a1b185f52d341310a83666dd54d1d1a77ed54ccbb649933e99" exitCode=0 Mar 20 14:15:02 crc kubenswrapper[4856]: I0320 14:15:02.157318 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" event={"ID":"08470efe-e57d-4003-9148-be1539c7a004","Type":"ContainerDied","Data":"269eb72aac38c2a1b185f52d341310a83666dd54d1d1a77ed54ccbb649933e99"} Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.443294 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.592721 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume\") pod \"08470efe-e57d-4003-9148-be1539c7a004\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.592758 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tmdl\" (UniqueName: \"kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl\") pod \"08470efe-e57d-4003-9148-be1539c7a004\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.592880 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume\") pod \"08470efe-e57d-4003-9148-be1539c7a004\" (UID: \"08470efe-e57d-4003-9148-be1539c7a004\") " Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.593677 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume" (OuterVolumeSpecName: "config-volume") pod "08470efe-e57d-4003-9148-be1539c7a004" (UID: "08470efe-e57d-4003-9148-be1539c7a004"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.598184 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "08470efe-e57d-4003-9148-be1539c7a004" (UID: "08470efe-e57d-4003-9148-be1539c7a004"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.599863 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl" (OuterVolumeSpecName: "kube-api-access-8tmdl") pod "08470efe-e57d-4003-9148-be1539c7a004" (UID: "08470efe-e57d-4003-9148-be1539c7a004"). InnerVolumeSpecName "kube-api-access-8tmdl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.694087 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/08470efe-e57d-4003-9148-be1539c7a004-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.694158 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tmdl\" (UniqueName: \"kubernetes.io/projected/08470efe-e57d-4003-9148-be1539c7a004-kube-api-access-8tmdl\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:03 crc kubenswrapper[4856]: I0320 14:15:03.694168 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/08470efe-e57d-4003-9148-be1539c7a004-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:04 crc kubenswrapper[4856]: I0320 14:15:04.178715 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" 
event={"ID":"08470efe-e57d-4003-9148-be1539c7a004","Type":"ContainerDied","Data":"20101387cc66ee83d5f6e114b0d50b14271d4c80a2686261a47848557ae46358"} Mar 20 14:15:04 crc kubenswrapper[4856]: I0320 14:15:04.178762 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20101387cc66ee83d5f6e114b0d50b14271d4c80a2686261a47848557ae46358" Mar 20 14:15:04 crc kubenswrapper[4856]: I0320 14:15:04.179403 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-vpf74" Mar 20 14:15:04 crc kubenswrapper[4856]: I0320 14:15:04.230967 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq"] Mar 20 14:15:04 crc kubenswrapper[4856]: I0320 14:15:04.236915 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-nsknq"] Mar 20 14:15:05 crc kubenswrapper[4856]: I0320 14:15:05.844237 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35c337cd-c045-4fa6-953d-e30cfa4d4ec3" path="/var/lib/kubelet/pods/35c337cd-c045-4fa6-953d-e30cfa4d4ec3/volumes" Mar 20 14:15:34 crc kubenswrapper[4856]: I0320 14:15:34.244214 4856 scope.go:117] "RemoveContainer" containerID="bd3aff4f27c60982965939c2511008090f46f1e7ad95effeaf9f79288c05bd4b" Mar 20 14:15:39 crc kubenswrapper[4856]: I0320 14:15:39.988511 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:15:39 crc kubenswrapper[4856]: I0320 14:15:39.989172 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.005963 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:15:47 crc kubenswrapper[4856]: E0320 14:15:47.006932 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08470efe-e57d-4003-9148-be1539c7a004" containerName="collect-profiles" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.006947 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="08470efe-e57d-4003-9148-be1539c7a004" containerName="collect-profiles" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.007149 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="08470efe-e57d-4003-9148-be1539c7a004" containerName="collect-profiles" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.008374 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.020635 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.172434 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42sr\" (UniqueName: \"kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.172511 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.172549 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.273580 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42sr\" (UniqueName: \"kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.273658 4856 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.273694 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.274197 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.274715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.309184 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42sr\" (UniqueName: \"kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr\") pod \"community-operators-9cvgx\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.334736 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:47 crc kubenswrapper[4856]: I0320 14:15:47.652043 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:15:48 crc kubenswrapper[4856]: I0320 14:15:48.550436 4856 generic.go:334] "Generic (PLEG): container finished" podID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerID="4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26" exitCode=0 Mar 20 14:15:48 crc kubenswrapper[4856]: I0320 14:15:48.550579 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerDied","Data":"4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26"} Mar 20 14:15:48 crc kubenswrapper[4856]: I0320 14:15:48.550715 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerStarted","Data":"886b627cf8901f3d12688ebd59b9f654f486ac0200fe457042afd0723f7061af"} Mar 20 14:15:49 crc kubenswrapper[4856]: I0320 14:15:49.558583 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerStarted","Data":"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c"} Mar 20 14:15:50 crc kubenswrapper[4856]: I0320 14:15:50.566513 4856 generic.go:334] "Generic (PLEG): container finished" podID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerID="80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c" exitCode=0 Mar 20 14:15:50 crc kubenswrapper[4856]: I0320 14:15:50.566603 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" 
event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerDied","Data":"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c"} Mar 20 14:15:51 crc kubenswrapper[4856]: I0320 14:15:51.576030 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerStarted","Data":"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6"} Mar 20 14:15:51 crc kubenswrapper[4856]: I0320 14:15:51.594019 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9cvgx" podStartSLOduration=2.999612932 podStartE2EDuration="5.593980102s" podCreationTimestamp="2026-03-20 14:15:46 +0000 UTC" firstStartedPulling="2026-03-20 14:15:48.55387778 +0000 UTC m=+3163.434903960" lastFinishedPulling="2026-03-20 14:15:51.14824498 +0000 UTC m=+3166.029271130" observedRunningTime="2026-03-20 14:15:51.592950524 +0000 UTC m=+3166.473976664" watchObservedRunningTime="2026-03-20 14:15:51.593980102 +0000 UTC m=+3166.475006232" Mar 20 14:15:57 crc kubenswrapper[4856]: I0320 14:15:57.335423 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:57 crc kubenswrapper[4856]: I0320 14:15:57.335882 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:57 crc kubenswrapper[4856]: I0320 14:15:57.376786 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:57 crc kubenswrapper[4856]: I0320 14:15:57.661857 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:15:57 crc kubenswrapper[4856]: I0320 14:15:57.718191 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:15:59 crc kubenswrapper[4856]: I0320 14:15:59.636407 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9cvgx" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="registry-server" containerID="cri-o://40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6" gracePeriod=2 Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.056453 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.061477 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities\") pod \"fb3a490a-bc12-4943-aa86-3c4460731c46\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.061590 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content\") pod \"fb3a490a-bc12-4943-aa86-3c4460731c46\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.061698 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42sr\" (UniqueName: \"kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr\") pod \"fb3a490a-bc12-4943-aa86-3c4460731c46\" (UID: \"fb3a490a-bc12-4943-aa86-3c4460731c46\") " Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.062185 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities" (OuterVolumeSpecName: "utilities") pod "fb3a490a-bc12-4943-aa86-3c4460731c46" (UID: 
"fb3a490a-bc12-4943-aa86-3c4460731c46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.062517 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.071909 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr" (OuterVolumeSpecName: "kube-api-access-c42sr") pod "fb3a490a-bc12-4943-aa86-3c4460731c46" (UID: "fb3a490a-bc12-4943-aa86-3c4460731c46"). InnerVolumeSpecName "kube-api-access-c42sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.123139 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb3a490a-bc12-4943-aa86-3c4460731c46" (UID: "fb3a490a-bc12-4943-aa86-3c4460731c46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.140170 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566936-h4cp5"] Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.140579 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="extract-content" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.140593 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="extract-content" Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.140615 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="extract-utilities" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.140622 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="extract-utilities" Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.140643 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="registry-server" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.140650 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="registry-server" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.140791 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerName="registry-server" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.141346 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-h4cp5" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.143053 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.143279 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.144356 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.145758 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-h4cp5"] Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.163005 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtk88\" (UniqueName: \"kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88\") pod \"auto-csr-approver-29566936-h4cp5\" (UID: \"ac7289f7-6042-4925-a9ff-a6349fab7e05\") " pod="openshift-infra/auto-csr-approver-29566936-h4cp5" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.163136 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb3a490a-bc12-4943-aa86-3c4460731c46-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.163157 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42sr\" (UniqueName: \"kubernetes.io/projected/fb3a490a-bc12-4943-aa86-3c4460731c46-kube-api-access-c42sr\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.264057 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtk88\" (UniqueName: 
\"kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88\") pod \"auto-csr-approver-29566936-h4cp5\" (UID: \"ac7289f7-6042-4925-a9ff-a6349fab7e05\") " pod="openshift-infra/auto-csr-approver-29566936-h4cp5" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.282224 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtk88\" (UniqueName: \"kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88\") pod \"auto-csr-approver-29566936-h4cp5\" (UID: \"ac7289f7-6042-4925-a9ff-a6349fab7e05\") " pod="openshift-infra/auto-csr-approver-29566936-h4cp5" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.468332 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-h4cp5" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.658521 4856 generic.go:334] "Generic (PLEG): container finished" podID="fb3a490a-bc12-4943-aa86-3c4460731c46" containerID="40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6" exitCode=0 Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.658572 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerDied","Data":"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6"} Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.658869 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9cvgx" event={"ID":"fb3a490a-bc12-4943-aa86-3c4460731c46","Type":"ContainerDied","Data":"886b627cf8901f3d12688ebd59b9f654f486ac0200fe457042afd0723f7061af"} Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.658892 4856 scope.go:117] "RemoveContainer" containerID="40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.658605 4856 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9cvgx" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.694192 4856 scope.go:117] "RemoveContainer" containerID="80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.696304 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.702816 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9cvgx"] Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.710157 4856 scope.go:117] "RemoveContainer" containerID="4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.724725 4856 scope.go:117] "RemoveContainer" containerID="40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6" Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.725163 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6\": container with ID starting with 40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6 not found: ID does not exist" containerID="40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.725305 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6"} err="failed to get container status \"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6\": rpc error: code = NotFound desc = could not find container \"40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6\": container with ID starting with 
40b0e03d257f936d570bcb93d14e459850283fe694846624da445f5cf0b633b6 not found: ID does not exist" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.725418 4856 scope.go:117] "RemoveContainer" containerID="80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c" Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.725773 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c\": container with ID starting with 80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c not found: ID does not exist" containerID="80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.725802 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c"} err="failed to get container status \"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c\": rpc error: code = NotFound desc = could not find container \"80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c\": container with ID starting with 80ad4693bf66c571926ab864baf153a30246a42d47daae004cb4fa3171d7737c not found: ID does not exist" Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.725823 4856 scope.go:117] "RemoveContainer" containerID="4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26" Mar 20 14:16:00 crc kubenswrapper[4856]: E0320 14:16:00.726004 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26\": container with ID starting with 4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26 not found: ID does not exist" containerID="4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26" Mar 20 14:16:00 crc 
kubenswrapper[4856]: I0320 14:16:00.726027 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26"} err="failed to get container status \"4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26\": rpc error: code = NotFound desc = could not find container \"4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26\": container with ID starting with 4ded5a8042360590a245ff614de14079e48b9771f879bfa0049de4fc8a9ead26 not found: ID does not exist"
Mar 20 14:16:00 crc kubenswrapper[4856]: I0320 14:16:00.923810 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-h4cp5"]
Mar 20 14:16:01 crc kubenswrapper[4856]: I0320 14:16:01.664935 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-h4cp5" event={"ID":"ac7289f7-6042-4925-a9ff-a6349fab7e05","Type":"ContainerStarted","Data":"0bef87c2188e288d2caf7f9448e4c8edce6aa5ab3e70ba293f4ea90fa5a9603b"}
Mar 20 14:16:01 crc kubenswrapper[4856]: I0320 14:16:01.835067 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3a490a-bc12-4943-aa86-3c4460731c46" path="/var/lib/kubelet/pods/fb3a490a-bc12-4943-aa86-3c4460731c46/volumes"
Mar 20 14:16:02 crc kubenswrapper[4856]: I0320 14:16:02.676593 4856 generic.go:334] "Generic (PLEG): container finished" podID="ac7289f7-6042-4925-a9ff-a6349fab7e05" containerID="4048bda3b61a57979fcb11c37fcca56c80dedd2bd84ded1ed433ef515d1f57f4" exitCode=0
Mar 20 14:16:02 crc kubenswrapper[4856]: I0320 14:16:02.676689 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-h4cp5" event={"ID":"ac7289f7-6042-4925-a9ff-a6349fab7e05","Type":"ContainerDied","Data":"4048bda3b61a57979fcb11c37fcca56c80dedd2bd84ded1ed433ef515d1f57f4"}
Mar 20 14:16:03 crc kubenswrapper[4856]: I0320 14:16:03.978977 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-h4cp5"
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.155823 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtk88\" (UniqueName: \"kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88\") pod \"ac7289f7-6042-4925-a9ff-a6349fab7e05\" (UID: \"ac7289f7-6042-4925-a9ff-a6349fab7e05\") "
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.161584 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88" (OuterVolumeSpecName: "kube-api-access-xtk88") pod "ac7289f7-6042-4925-a9ff-a6349fab7e05" (UID: "ac7289f7-6042-4925-a9ff-a6349fab7e05"). InnerVolumeSpecName "kube-api-access-xtk88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.257488 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtk88\" (UniqueName: \"kubernetes.io/projected/ac7289f7-6042-4925-a9ff-a6349fab7e05-kube-api-access-xtk88\") on node \"crc\" DevicePath \"\""
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.691710 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-h4cp5" event={"ID":"ac7289f7-6042-4925-a9ff-a6349fab7e05","Type":"ContainerDied","Data":"0bef87c2188e288d2caf7f9448e4c8edce6aa5ab3e70ba293f4ea90fa5a9603b"}
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.691750 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bef87c2188e288d2caf7f9448e4c8edce6aa5ab3e70ba293f4ea90fa5a9603b"
Mar 20 14:16:04 crc kubenswrapper[4856]: I0320 14:16:04.691803 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-h4cp5"
Mar 20 14:16:05 crc kubenswrapper[4856]: I0320 14:16:05.067184 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-srhjx"]
Mar 20 14:16:05 crc kubenswrapper[4856]: I0320 14:16:05.075833 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-srhjx"]
Mar 20 14:16:05 crc kubenswrapper[4856]: I0320 14:16:05.829906 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3a798da-a318-4bff-a41d-8e35dff4b66e" path="/var/lib/kubelet/pods/f3a798da-a318-4bff-a41d-8e35dff4b66e/volumes"
Mar 20 14:16:09 crc kubenswrapper[4856]: I0320 14:16:09.987200 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:16:09 crc kubenswrapper[4856]: I0320 14:16:09.987775 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:16:34 crc kubenswrapper[4856]: I0320 14:16:34.302509 4856 scope.go:117] "RemoveContainer" containerID="6de0045c9d42ca9c4e1ee154c89fdbaef2b05bd68a67e4849218b4939c42eeb7"
Mar 20 14:16:39 crc kubenswrapper[4856]: I0320 14:16:39.987595 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:16:39 crc kubenswrapper[4856]: I0320 14:16:39.988421 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:16:39 crc kubenswrapper[4856]: I0320 14:16:39.988499 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4"
Mar 20 14:16:39 crc kubenswrapper[4856]: I0320 14:16:39.989581 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:16:39 crc kubenswrapper[4856]: I0320 14:16:39.989684 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" gracePeriod=600
Mar 20 14:16:40 crc kubenswrapper[4856]: E0320 14:16:40.113955 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:16:40 crc kubenswrapper[4856]: I0320 14:16:40.985805 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" exitCode=0
Mar 20 14:16:40 crc kubenswrapper[4856]: I0320 14:16:40.985866 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"}
Mar 20 14:16:40 crc kubenswrapper[4856]: I0320 14:16:40.986109 4856 scope.go:117] "RemoveContainer" containerID="6c33083bf7f776ff9a8a33f1ba32356a0e7714375f33fbf76552f6dafe74c74b"
Mar 20 14:16:40 crc kubenswrapper[4856]: I0320 14:16:40.986899 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:16:40 crc kubenswrapper[4856]: E0320 14:16:40.987300 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:16:55 crc kubenswrapper[4856]: I0320 14:16:55.823373 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:16:55 crc kubenswrapper[4856]: E0320 14:16:55.824157 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:17:07 crc kubenswrapper[4856]: I0320 14:17:07.820466 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:17:07 crc kubenswrapper[4856]: E0320 14:17:07.822189 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:17:18 crc kubenswrapper[4856]: I0320 14:17:18.819741 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:17:18 crc kubenswrapper[4856]: E0320 14:17:18.820489 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:17:29 crc kubenswrapper[4856]: I0320 14:17:29.819628 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:17:29 crc kubenswrapper[4856]: E0320 14:17:29.820344 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:17:44 crc kubenswrapper[4856]: I0320 14:17:44.819905 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:17:44 crc kubenswrapper[4856]: E0320 14:17:44.820780 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:17:58 crc kubenswrapper[4856]: I0320 14:17:58.820042 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:17:58 crc kubenswrapper[4856]: E0320 14:17:58.820932 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.190650 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566938-9kk29"]
Mar 20 14:18:00 crc kubenswrapper[4856]: E0320 14:18:00.191543 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac7289f7-6042-4925-a9ff-a6349fab7e05" containerName="oc"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.191562 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac7289f7-6042-4925-a9ff-a6349fab7e05" containerName="oc"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.191751 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac7289f7-6042-4925-a9ff-a6349fab7e05" containerName="oc"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.192373 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.201642 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.201721 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.202870 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.214436 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdkcm\" (UniqueName: \"kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm\") pod \"auto-csr-approver-29566938-9kk29\" (UID: \"737aaad0-34bc-44c8-aa37-b49e938af62a\") " pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.214704 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-9kk29"]
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.315999 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdkcm\" (UniqueName: \"kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm\") pod \"auto-csr-approver-29566938-9kk29\" (UID: \"737aaad0-34bc-44c8-aa37-b49e938af62a\") " pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.346987 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdkcm\" (UniqueName: \"kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm\") pod \"auto-csr-approver-29566938-9kk29\" (UID: \"737aaad0-34bc-44c8-aa37-b49e938af62a\") " pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.512530 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.928477 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-9kk29"]
Mar 20 14:18:00 crc kubenswrapper[4856]: I0320 14:18:00.934792 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:18:01 crc kubenswrapper[4856]: I0320 14:18:01.668669 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-9kk29" event={"ID":"737aaad0-34bc-44c8-aa37-b49e938af62a","Type":"ContainerStarted","Data":"c80fd5f45ee47ade3909b40346c0962ce072e334fdf4cb68a15e1ca81a653dd2"}
Mar 20 14:18:02 crc kubenswrapper[4856]: I0320 14:18:02.675148 4856 generic.go:334] "Generic (PLEG): container finished" podID="737aaad0-34bc-44c8-aa37-b49e938af62a" containerID="edf9ddbcfed6996f2009b404c0247671b6b05aa3d779fc8cff96206af9d1abe0" exitCode=0
Mar 20 14:18:02 crc kubenswrapper[4856]: I0320 14:18:02.675199 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-9kk29" event={"ID":"737aaad0-34bc-44c8-aa37-b49e938af62a","Type":"ContainerDied","Data":"edf9ddbcfed6996f2009b404c0247671b6b05aa3d779fc8cff96206af9d1abe0"}
Mar 20 14:18:03 crc kubenswrapper[4856]: I0320 14:18:03.936104 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:03 crc kubenswrapper[4856]: I0320 14:18:03.969785 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdkcm\" (UniqueName: \"kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm\") pod \"737aaad0-34bc-44c8-aa37-b49e938af62a\" (UID: \"737aaad0-34bc-44c8-aa37-b49e938af62a\") "
Mar 20 14:18:03 crc kubenswrapper[4856]: I0320 14:18:03.975470 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm" (OuterVolumeSpecName: "kube-api-access-pdkcm") pod "737aaad0-34bc-44c8-aa37-b49e938af62a" (UID: "737aaad0-34bc-44c8-aa37-b49e938af62a"). InnerVolumeSpecName "kube-api-access-pdkcm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:18:04 crc kubenswrapper[4856]: I0320 14:18:04.071885 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdkcm\" (UniqueName: \"kubernetes.io/projected/737aaad0-34bc-44c8-aa37-b49e938af62a-kube-api-access-pdkcm\") on node \"crc\" DevicePath \"\""
Mar 20 14:18:04 crc kubenswrapper[4856]: I0320 14:18:04.694387 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-9kk29" event={"ID":"737aaad0-34bc-44c8-aa37-b49e938af62a","Type":"ContainerDied","Data":"c80fd5f45ee47ade3909b40346c0962ce072e334fdf4cb68a15e1ca81a653dd2"}
Mar 20 14:18:04 crc kubenswrapper[4856]: I0320 14:18:04.694449 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c80fd5f45ee47ade3909b40346c0962ce072e334fdf4cb68a15e1ca81a653dd2"
Mar 20 14:18:04 crc kubenswrapper[4856]: I0320 14:18:04.694499 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-9kk29"
Mar 20 14:18:05 crc kubenswrapper[4856]: I0320 14:18:05.009138 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-p9bz8"]
Mar 20 14:18:05 crc kubenswrapper[4856]: I0320 14:18:05.013516 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-p9bz8"]
Mar 20 14:18:05 crc kubenswrapper[4856]: I0320 14:18:05.834660 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98b813be-bb3c-487f-afa0-98960c34bd31" path="/var/lib/kubelet/pods/98b813be-bb3c-487f-afa0-98960c34bd31/volumes"
Mar 20 14:18:10 crc kubenswrapper[4856]: I0320 14:18:10.819983 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:18:10 crc kubenswrapper[4856]: E0320 14:18:10.820877 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:18:22 crc kubenswrapper[4856]: I0320 14:18:22.821017 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:18:22 crc kubenswrapper[4856]: E0320 14:18:22.822191 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:18:33 crc kubenswrapper[4856]: I0320 14:18:33.820137 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:18:33 crc kubenswrapper[4856]: E0320 14:18:33.820973 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:18:34 crc kubenswrapper[4856]: I0320 14:18:34.409628 4856 scope.go:117] "RemoveContainer" containerID="ecff83741f67618729e6e444a415da8f8b855eeb4fb52f86aad147b9e77ff992"
Mar 20 14:18:46 crc kubenswrapper[4856]: I0320 14:18:46.820674 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:18:46 crc kubenswrapper[4856]: E0320 14:18:46.822603 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:18:57 crc kubenswrapper[4856]: I0320 14:18:57.820500 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:18:57 crc kubenswrapper[4856]: E0320 14:18:57.821701 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.356312 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:07 crc kubenswrapper[4856]: E0320 14:19:07.357173 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737aaad0-34bc-44c8-aa37-b49e938af62a" containerName="oc"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.357185 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="737aaad0-34bc-44c8-aa37-b49e938af62a" containerName="oc"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.357388 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="737aaad0-34bc-44c8-aa37-b49e938af62a" containerName="oc"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.358296 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.377248 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.416400 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx49v\" (UniqueName: \"kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.416457 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.416529 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.517919 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx49v\" (UniqueName: \"kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.518184 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.518342 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.518817 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.518828 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.537093 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx49v\" (UniqueName: \"kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v\") pod \"certified-operators-2mfrq\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") " pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:07 crc kubenswrapper[4856]: I0320 14:19:07.680071 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:08 crc kubenswrapper[4856]: I0320 14:19:08.173914 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:08 crc kubenswrapper[4856]: I0320 14:19:08.203529 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerStarted","Data":"1fb80f4a563639b4a6a25d6ec2aaf451b8472e1726d068e9b9899256b27c3957"}
Mar 20 14:19:08 crc kubenswrapper[4856]: I0320 14:19:08.820102 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:19:08 crc kubenswrapper[4856]: E0320 14:19:08.820725 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:19:09 crc kubenswrapper[4856]: I0320 14:19:09.213705 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerID="02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002" exitCode=0
Mar 20 14:19:09 crc kubenswrapper[4856]: I0320 14:19:09.213798 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerDied","Data":"02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002"}
Mar 20 14:19:11 crc kubenswrapper[4856]: I0320 14:19:11.238389 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerID="105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa" exitCode=0
Mar 20 14:19:11 crc kubenswrapper[4856]: I0320 14:19:11.238449 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerDied","Data":"105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa"}
Mar 20 14:19:13 crc kubenswrapper[4856]: I0320 14:19:13.261145 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerStarted","Data":"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"}
Mar 20 14:19:13 crc kubenswrapper[4856]: I0320 14:19:13.286723 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2mfrq" podStartSLOduration=2.895420128 podStartE2EDuration="6.286705601s" podCreationTimestamp="2026-03-20 14:19:07 +0000 UTC" firstStartedPulling="2026-03-20 14:19:09.215710446 +0000 UTC m=+3364.096736596" lastFinishedPulling="2026-03-20 14:19:12.606995889 +0000 UTC m=+3367.488022069" observedRunningTime="2026-03-20 14:19:13.286212458 +0000 UTC m=+3368.167238618" watchObservedRunningTime="2026-03-20 14:19:13.286705601 +0000 UTC m=+3368.167731731"
Mar 20 14:19:17 crc kubenswrapper[4856]: I0320 14:19:17.680819 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:17 crc kubenswrapper[4856]: I0320 14:19:17.681564 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:17 crc kubenswrapper[4856]: I0320 14:19:17.731255 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:18 crc kubenswrapper[4856]: I0320 14:19:18.343857 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:18 crc kubenswrapper[4856]: I0320 14:19:18.417446 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:19 crc kubenswrapper[4856]: I0320 14:19:19.819448 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:19:19 crc kubenswrapper[4856]: E0320 14:19:19.819935 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.317957 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2mfrq" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="registry-server" containerID="cri-o://30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a" gracePeriod=2
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.787926 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.817376 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx49v\" (UniqueName: \"kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v\") pod \"e2051b29-77f7-4d51-8c66-0a83dbb95915\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") "
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.817509 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content\") pod \"e2051b29-77f7-4d51-8c66-0a83dbb95915\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") "
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.817560 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities\") pod \"e2051b29-77f7-4d51-8c66-0a83dbb95915\" (UID: \"e2051b29-77f7-4d51-8c66-0a83dbb95915\") "
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.818511 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities" (OuterVolumeSpecName: "utilities") pod "e2051b29-77f7-4d51-8c66-0a83dbb95915" (UID: "e2051b29-77f7-4d51-8c66-0a83dbb95915"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.823745 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v" (OuterVolumeSpecName: "kube-api-access-rx49v") pod "e2051b29-77f7-4d51-8c66-0a83dbb95915" (UID: "e2051b29-77f7-4d51-8c66-0a83dbb95915"). InnerVolumeSpecName "kube-api-access-rx49v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.882744 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e2051b29-77f7-4d51-8c66-0a83dbb95915" (UID: "e2051b29-77f7-4d51-8c66-0a83dbb95915"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.918811 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx49v\" (UniqueName: \"kubernetes.io/projected/e2051b29-77f7-4d51-8c66-0a83dbb95915-kube-api-access-rx49v\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.918857 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:20 crc kubenswrapper[4856]: I0320 14:19:20.918876 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e2051b29-77f7-4d51-8c66-0a83dbb95915-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.336654 4856 generic.go:334] "Generic (PLEG): container finished" podID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerID="30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a" exitCode=0
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.336700 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerDied","Data":"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"}
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.336729 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2mfrq" event={"ID":"e2051b29-77f7-4d51-8c66-0a83dbb95915","Type":"ContainerDied","Data":"1fb80f4a563639b4a6a25d6ec2aaf451b8472e1726d068e9b9899256b27c3957"}
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.336747 4856 scope.go:117] "RemoveContainer" containerID="30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.336873 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2mfrq"
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.377015 4856 scope.go:117] "RemoveContainer" containerID="105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa"
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.378802 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.390116 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2mfrq"]
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.403766 4856 scope.go:117] "RemoveContainer" containerID="02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002"
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.443065 4856 scope.go:117] "RemoveContainer" containerID="30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"
Mar 20 14:19:21 crc kubenswrapper[4856]: E0320 14:19:21.443519 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a\": container with ID starting with 30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a not found: ID does not exist" containerID="30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"
Mar 20 14:19:21 crc kubenswrapper[4856]: I0320
14:19:21.443563 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a"} err="failed to get container status \"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a\": rpc error: code = NotFound desc = could not find container \"30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a\": container with ID starting with 30bcf76ec532e18c7c94bc4cced8500196a9086c3ba81606311771400db5df8a not found: ID does not exist" Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.443588 4856 scope.go:117] "RemoveContainer" containerID="105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa" Mar 20 14:19:21 crc kubenswrapper[4856]: E0320 14:19:21.443982 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa\": container with ID starting with 105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa not found: ID does not exist" containerID="105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa" Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.444003 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa"} err="failed to get container status \"105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa\": rpc error: code = NotFound desc = could not find container \"105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa\": container with ID starting with 105fce209bbca5aa9071f5809822f600b14adc778d8d7d5ec8b190c059985afa not found: ID does not exist" Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.444016 4856 scope.go:117] "RemoveContainer" containerID="02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002" Mar 20 14:19:21 crc 
kubenswrapper[4856]: E0320 14:19:21.444318 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002\": container with ID starting with 02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002 not found: ID does not exist" containerID="02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002" Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.444343 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002"} err="failed to get container status \"02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002\": rpc error: code = NotFound desc = could not find container \"02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002\": container with ID starting with 02ddf3db304194d0a1b9a945d0f50365df1f6eb86326dcf0960a4fda09a41002 not found: ID does not exist" Mar 20 14:19:21 crc kubenswrapper[4856]: I0320 14:19:21.835182 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" path="/var/lib/kubelet/pods/e2051b29-77f7-4d51-8c66-0a83dbb95915/volumes" Mar 20 14:19:33 crc kubenswrapper[4856]: I0320 14:19:33.819852 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:19:33 crc kubenswrapper[4856]: E0320 14:19:33.820717 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:19:47 crc 
kubenswrapper[4856]: I0320 14:19:47.819591 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:19:47 crc kubenswrapper[4856]: E0320 14:19:47.820537 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.141921 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566940-86spd"] Mar 20 14:20:00 crc kubenswrapper[4856]: E0320 14:20:00.142816 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="extract-utilities" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.142834 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="extract-utilities" Mar 20 14:20:00 crc kubenswrapper[4856]: E0320 14:20:00.142851 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="registry-server" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.142861 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="registry-server" Mar 20 14:20:00 crc kubenswrapper[4856]: E0320 14:20:00.142888 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="extract-content" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.142896 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" 
containerName="extract-content" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.143047 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2051b29-77f7-4d51-8c66-0a83dbb95915" containerName="registry-server" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.143505 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.145637 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.146016 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.146060 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.155974 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-86spd"] Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.292149 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6zfl\" (UniqueName: \"kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl\") pod \"auto-csr-approver-29566940-86spd\" (UID: \"f66ec89b-f68a-4018-958a-746d7b1b3d11\") " pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.394003 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6zfl\" (UniqueName: \"kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl\") pod \"auto-csr-approver-29566940-86spd\" (UID: \"f66ec89b-f68a-4018-958a-746d7b1b3d11\") " pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 
14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.415218 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6zfl\" (UniqueName: \"kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl\") pod \"auto-csr-approver-29566940-86spd\" (UID: \"f66ec89b-f68a-4018-958a-746d7b1b3d11\") " pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.464058 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:00 crc kubenswrapper[4856]: I0320 14:20:00.873670 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-86spd"] Mar 20 14:20:01 crc kubenswrapper[4856]: I0320 14:20:01.624970 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-86spd" event={"ID":"f66ec89b-f68a-4018-958a-746d7b1b3d11","Type":"ContainerStarted","Data":"1e1a824bc25b647d6e266f3cd8d7e74fa21f5d6621aa5ab3585eb8d77df426b7"} Mar 20 14:20:01 crc kubenswrapper[4856]: I0320 14:20:01.820804 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:20:01 crc kubenswrapper[4856]: E0320 14:20:01.821176 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:03 crc kubenswrapper[4856]: I0320 14:20:03.641048 4856 generic.go:334] "Generic (PLEG): container finished" podID="f66ec89b-f68a-4018-958a-746d7b1b3d11" 
containerID="a4c2ad1223c0ed0f566f4e1d0328a497c7be9119f719a070f60c2b073021ab17" exitCode=0 Mar 20 14:20:03 crc kubenswrapper[4856]: I0320 14:20:03.641113 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-86spd" event={"ID":"f66ec89b-f68a-4018-958a-746d7b1b3d11","Type":"ContainerDied","Data":"a4c2ad1223c0ed0f566f4e1d0328a497c7be9119f719a070f60c2b073021ab17"} Mar 20 14:20:04 crc kubenswrapper[4856]: I0320 14:20:04.952131 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.060900 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6zfl\" (UniqueName: \"kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl\") pod \"f66ec89b-f68a-4018-958a-746d7b1b3d11\" (UID: \"f66ec89b-f68a-4018-958a-746d7b1b3d11\") " Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.067377 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl" (OuterVolumeSpecName: "kube-api-access-p6zfl") pod "f66ec89b-f68a-4018-958a-746d7b1b3d11" (UID: "f66ec89b-f68a-4018-958a-746d7b1b3d11"). InnerVolumeSpecName "kube-api-access-p6zfl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.162498 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6zfl\" (UniqueName: \"kubernetes.io/projected/f66ec89b-f68a-4018-958a-746d7b1b3d11-kube-api-access-p6zfl\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.656655 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-86spd" event={"ID":"f66ec89b-f68a-4018-958a-746d7b1b3d11","Type":"ContainerDied","Data":"1e1a824bc25b647d6e266f3cd8d7e74fa21f5d6621aa5ab3585eb8d77df426b7"} Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.657002 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e1a824bc25b647d6e266f3cd8d7e74fa21f5d6621aa5ab3585eb8d77df426b7" Mar 20 14:20:05 crc kubenswrapper[4856]: I0320 14:20:05.656698 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-86spd" Mar 20 14:20:05 crc kubenswrapper[4856]: E0320 14:20:05.769790 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66ec89b_f68a_4018_958a_746d7b1b3d11.slice/crio-1e1a824bc25b647d6e266f3cd8d7e74fa21f5d6621aa5ab3585eb8d77df426b7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf66ec89b_f68a_4018_958a_746d7b1b3d11.slice\": RecentStats: unable to find data in memory cache]" Mar 20 14:20:06 crc kubenswrapper[4856]: I0320 14:20:06.020758 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-pqnwv"] Mar 20 14:20:06 crc kubenswrapper[4856]: I0320 14:20:06.027297 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-pqnwv"] 
Mar 20 14:20:07 crc kubenswrapper[4856]: I0320 14:20:07.827240 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5b1f2d-7e51-4c02-8222-36edd975c8a6" path="/var/lib/kubelet/pods/dd5b1f2d-7e51-4c02-8222-36edd975c8a6/volumes" Mar 20 14:20:12 crc kubenswrapper[4856]: I0320 14:20:12.820719 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:20:12 crc kubenswrapper[4856]: E0320 14:20:12.821474 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:23 crc kubenswrapper[4856]: I0320 14:20:23.819758 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:20:23 crc kubenswrapper[4856]: E0320 14:20:23.820594 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:34 crc kubenswrapper[4856]: I0320 14:20:34.520620 4856 scope.go:117] "RemoveContainer" containerID="5fa5e93bb54791981d5ea566e8d13383ad5ff72df8a6e713ecc1214a81a1a247" Mar 20 14:20:37 crc kubenswrapper[4856]: I0320 14:20:37.820388 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:20:37 crc kubenswrapper[4856]: E0320 
14:20:37.821237 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.127214 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"] Mar 20 14:20:45 crc kubenswrapper[4856]: E0320 14:20:45.128173 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ec89b-f68a-4018-958a-746d7b1b3d11" containerName="oc" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.128190 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ec89b-f68a-4018-958a-746d7b1b3d11" containerName="oc" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.128411 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66ec89b-f68a-4018-958a-746d7b1b3d11" containerName="oc" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.130444 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.140418 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"] Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.286035 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.286173 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fklb\" (UniqueName: \"kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.286230 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.387529 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.387617 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-7fklb\" (UniqueName: \"kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.387672 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.388163 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.388195 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.412722 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fklb\" (UniqueName: \"kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb\") pod \"redhat-operators-v75rn\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") " pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.459153 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.929453 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"] Mar 20 14:20:45 crc kubenswrapper[4856]: I0320 14:20:45.955763 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerStarted","Data":"6ff4c5d2e81ad6355e60c322cd17d53319fb36e6eab29725d2d5055c316aec3b"} Mar 20 14:20:46 crc kubenswrapper[4856]: I0320 14:20:46.966201 4856 generic.go:334] "Generic (PLEG): container finished" podID="1430da13-aca2-41f5-8255-1c4bc9603550" containerID="dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c" exitCode=0 Mar 20 14:20:46 crc kubenswrapper[4856]: I0320 14:20:46.966263 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerDied","Data":"dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c"} Mar 20 14:20:48 crc kubenswrapper[4856]: I0320 14:20:48.987358 4856 generic.go:334] "Generic (PLEG): container finished" podID="1430da13-aca2-41f5-8255-1c4bc9603550" containerID="5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52" exitCode=0 Mar 20 14:20:48 crc kubenswrapper[4856]: I0320 14:20:48.987495 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerDied","Data":"5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52"} Mar 20 14:20:49 crc kubenswrapper[4856]: I0320 14:20:49.819500 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:20:49 crc kubenswrapper[4856]: E0320 14:20:49.820205 4856 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:20:49 crc kubenswrapper[4856]: I0320 14:20:49.996091 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerStarted","Data":"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"} Mar 20 14:20:55 crc kubenswrapper[4856]: I0320 14:20:55.459565 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:55 crc kubenswrapper[4856]: I0320 14:20:55.460205 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v75rn" Mar 20 14:20:56 crc kubenswrapper[4856]: I0320 14:20:56.505937 4856 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v75rn" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="registry-server" probeResult="failure" output=< Mar 20 14:20:56 crc kubenswrapper[4856]: timeout: failed to connect service ":50051" within 1s Mar 20 14:20:56 crc kubenswrapper[4856]: > Mar 20 14:21:02 crc kubenswrapper[4856]: I0320 14:21:02.823229 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:21:02 crc kubenswrapper[4856]: E0320 14:21:02.824302 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:21:05 crc kubenswrapper[4856]: I0320 14:21:05.506506 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v75rn"
Mar 20 14:21:05 crc kubenswrapper[4856]: I0320 14:21:05.532745 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v75rn" podStartSLOduration=18.106678137 podStartE2EDuration="20.532727704s" podCreationTimestamp="2026-03-20 14:20:45 +0000 UTC" firstStartedPulling="2026-03-20 14:20:46.968673418 +0000 UTC m=+3461.849699558" lastFinishedPulling="2026-03-20 14:20:49.394722985 +0000 UTC m=+3464.275749125" observedRunningTime="2026-03-20 14:20:50.015711839 +0000 UTC m=+3464.896737989" watchObservedRunningTime="2026-03-20 14:21:05.532727704 +0000 UTC m=+3480.413753834"
Mar 20 14:21:05 crc kubenswrapper[4856]: I0320 14:21:05.549056 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v75rn"
Mar 20 14:21:05 crc kubenswrapper[4856]: I0320 14:21:05.750942 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"]
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.131928 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v75rn" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="registry-server" containerID="cri-o://89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1" gracePeriod=2
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.518175 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75rn"
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.637446 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fklb\" (UniqueName: \"kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb\") pod \"1430da13-aca2-41f5-8255-1c4bc9603550\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") "
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.637547 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content\") pod \"1430da13-aca2-41f5-8255-1c4bc9603550\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") "
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.637588 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities\") pod \"1430da13-aca2-41f5-8255-1c4bc9603550\" (UID: \"1430da13-aca2-41f5-8255-1c4bc9603550\") "
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.638523 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities" (OuterVolumeSpecName: "utilities") pod "1430da13-aca2-41f5-8255-1c4bc9603550" (UID: "1430da13-aca2-41f5-8255-1c4bc9603550"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.643439 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb" (OuterVolumeSpecName: "kube-api-access-7fklb") pod "1430da13-aca2-41f5-8255-1c4bc9603550" (UID: "1430da13-aca2-41f5-8255-1c4bc9603550"). InnerVolumeSpecName "kube-api-access-7fklb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.739871 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.739907 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fklb\" (UniqueName: \"kubernetes.io/projected/1430da13-aca2-41f5-8255-1c4bc9603550-kube-api-access-7fklb\") on node \"crc\" DevicePath \"\""
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.765846 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1430da13-aca2-41f5-8255-1c4bc9603550" (UID: "1430da13-aca2-41f5-8255-1c4bc9603550"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:21:07 crc kubenswrapper[4856]: I0320 14:21:07.840894 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1430da13-aca2-41f5-8255-1c4bc9603550-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.141950 4856 generic.go:334] "Generic (PLEG): container finished" podID="1430da13-aca2-41f5-8255-1c4bc9603550" containerID="89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1" exitCode=0
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.141992 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerDied","Data":"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"}
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.142018 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v75rn" event={"ID":"1430da13-aca2-41f5-8255-1c4bc9603550","Type":"ContainerDied","Data":"6ff4c5d2e81ad6355e60c322cd17d53319fb36e6eab29725d2d5055c316aec3b"}
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.142019 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v75rn"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.142034 4856 scope.go:117] "RemoveContainer" containerID="89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.173213 4856 scope.go:117] "RemoveContainer" containerID="5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.182382 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"]
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.189448 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v75rn"]
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.196509 4856 scope.go:117] "RemoveContainer" containerID="dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.214549 4856 scope.go:117] "RemoveContainer" containerID="89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"
Mar 20 14:21:08 crc kubenswrapper[4856]: E0320 14:21:08.215033 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1\": container with ID starting with 89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1 not found: ID does not exist" containerID="89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.215079 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1"} err="failed to get container status \"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1\": rpc error: code = NotFound desc = could not find container \"89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1\": container with ID starting with 89d686abb1ce5d4b350ec75f1640b8c3816a01a14cd8f7beedfb0b2f829f17a1 not found: ID does not exist"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.215105 4856 scope.go:117] "RemoveContainer" containerID="5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52"
Mar 20 14:21:08 crc kubenswrapper[4856]: E0320 14:21:08.216355 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52\": container with ID starting with 5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52 not found: ID does not exist" containerID="5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.216392 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52"} err="failed to get container status \"5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52\": rpc error: code = NotFound desc = could not find container \"5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52\": container with ID starting with 5e1ddce13124353db206565e3c83601976696b50517fb25f9d7b8501a722db52 not found: ID does not exist"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.216449 4856 scope.go:117] "RemoveContainer" containerID="dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c"
Mar 20 14:21:08 crc kubenswrapper[4856]: E0320 14:21:08.216778 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c\": container with ID starting with dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c not found: ID does not exist" containerID="dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c"
Mar 20 14:21:08 crc kubenswrapper[4856]: I0320 14:21:08.216857 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c"} err="failed to get container status \"dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c\": rpc error: code = NotFound desc = could not find container \"dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c\": container with ID starting with dc461601cab0259b24ceef6d1f851751d30b355772c3a1bb6e6b846a4c1e889c not found: ID does not exist"
Mar 20 14:21:09 crc kubenswrapper[4856]: I0320 14:21:09.828941 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" path="/var/lib/kubelet/pods/1430da13-aca2-41f5-8255-1c4bc9603550/volumes"
Mar 20 14:21:16 crc kubenswrapper[4856]: I0320 14:21:16.819852 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:21:16 crc kubenswrapper[4856]: E0320 14:21:16.820861 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:21:27 crc kubenswrapper[4856]: I0320 14:21:27.820087 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:21:27 crc kubenswrapper[4856]: E0320 14:21:27.821091 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:21:41 crc kubenswrapper[4856]: I0320 14:21:41.820255 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d"
Mar 20 14:21:42 crc kubenswrapper[4856]: I0320 14:21:42.396724 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f"}
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.144501 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566942-2j49n"]
Mar 20 14:22:00 crc kubenswrapper[4856]: E0320 14:22:00.145247 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="extract-utilities"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.145260 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="extract-utilities"
Mar 20 14:22:00 crc kubenswrapper[4856]: E0320 14:22:00.145300 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="extract-content"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.145308 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="extract-content"
Mar 20 14:22:00 crc kubenswrapper[4856]: E0320 14:22:00.145314 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="registry-server"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.145321 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="registry-server"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.145452 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1430da13-aca2-41f5-8255-1c4bc9603550" containerName="registry-server"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.145915 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.148386 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.148592 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.155359 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-2j49n"]
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.158811 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.259482 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfh7\" (UniqueName: \"kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7\") pod \"auto-csr-approver-29566942-2j49n\" (UID: \"13c88e41-149c-4ad7-9b82-94979ec6ceeb\") " pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.360562 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfh7\" (UniqueName: \"kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7\") pod \"auto-csr-approver-29566942-2j49n\" (UID: \"13c88e41-149c-4ad7-9b82-94979ec6ceeb\") " pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.381633 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfh7\" (UniqueName: \"kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7\") pod \"auto-csr-approver-29566942-2j49n\" (UID: \"13c88e41-149c-4ad7-9b82-94979ec6ceeb\") " pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.481642 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:00 crc kubenswrapper[4856]: I0320 14:22:00.898586 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-2j49n"]
Mar 20 14:22:00 crc kubenswrapper[4856]: W0320 14:22:00.904561 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13c88e41_149c_4ad7_9b82_94979ec6ceeb.slice/crio-6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0 WatchSource:0}: Error finding container 6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0: Status 404 returned error can't find the container with id 6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0
Mar 20 14:22:01 crc kubenswrapper[4856]: I0320 14:22:01.522248 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-2j49n" event={"ID":"13c88e41-149c-4ad7-9b82-94979ec6ceeb","Type":"ContainerStarted","Data":"6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0"}
Mar 20 14:22:02 crc kubenswrapper[4856]: I0320 14:22:02.532128 4856 generic.go:334] "Generic (PLEG): container finished" podID="13c88e41-149c-4ad7-9b82-94979ec6ceeb" containerID="af002cb82aa149c9a4d5b242f018c1ca6bce5ecfb59b9e5e7bba661eca5f28ab" exitCode=0
Mar 20 14:22:02 crc kubenswrapper[4856]: I0320 14:22:02.532198 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-2j49n" event={"ID":"13c88e41-149c-4ad7-9b82-94979ec6ceeb","Type":"ContainerDied","Data":"af002cb82aa149c9a4d5b242f018c1ca6bce5ecfb59b9e5e7bba661eca5f28ab"}
Mar 20 14:22:03 crc kubenswrapper[4856]: I0320 14:22:03.829009 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.012789 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqfh7\" (UniqueName: \"kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7\") pod \"13c88e41-149c-4ad7-9b82-94979ec6ceeb\" (UID: \"13c88e41-149c-4ad7-9b82-94979ec6ceeb\") "
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.019052 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7" (OuterVolumeSpecName: "kube-api-access-xqfh7") pod "13c88e41-149c-4ad7-9b82-94979ec6ceeb" (UID: "13c88e41-149c-4ad7-9b82-94979ec6ceeb"). InnerVolumeSpecName "kube-api-access-xqfh7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.115611 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqfh7\" (UniqueName: \"kubernetes.io/projected/13c88e41-149c-4ad7-9b82-94979ec6ceeb-kube-api-access-xqfh7\") on node \"crc\" DevicePath \"\""
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.546249 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-2j49n" event={"ID":"13c88e41-149c-4ad7-9b82-94979ec6ceeb","Type":"ContainerDied","Data":"6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0"}
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.546320 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6bda495f0c1292b72ca3519e8a39990faaee498c965c0201bb5639896f5893b0"
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.546365 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-2j49n"
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.893903 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-h4cp5"]
Mar 20 14:22:04 crc kubenswrapper[4856]: I0320 14:22:04.928893 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-h4cp5"]
Mar 20 14:22:05 crc kubenswrapper[4856]: I0320 14:22:05.837281 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac7289f7-6042-4925-a9ff-a6349fab7e05" path="/var/lib/kubelet/pods/ac7289f7-6042-4925-a9ff-a6349fab7e05/volumes"
Mar 20 14:22:34 crc kubenswrapper[4856]: I0320 14:22:34.619531 4856 scope.go:117] "RemoveContainer" containerID="4048bda3b61a57979fcb11c37fcca56c80dedd2bd84ded1ed433ef515d1f57f4"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.149496 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566944-plgqk"]
Mar 20 14:24:00 crc kubenswrapper[4856]: E0320 14:24:00.150796 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c88e41-149c-4ad7-9b82-94979ec6ceeb" containerName="oc"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.150815 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c88e41-149c-4ad7-9b82-94979ec6ceeb" containerName="oc"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.150972 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c88e41-149c-4ad7-9b82-94979ec6ceeb" containerName="oc"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.151439 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.156784 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.156848 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.156864 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.157456 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-plgqk"]
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.303738 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbt59\" (UniqueName: \"kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59\") pod \"auto-csr-approver-29566944-plgqk\" (UID: \"d18e4fe0-a1b7-48bc-843b-94faa5785712\") " pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.405497 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbt59\" (UniqueName: \"kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59\") pod \"auto-csr-approver-29566944-plgqk\" (UID: \"d18e4fe0-a1b7-48bc-843b-94faa5785712\") " pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.425638 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbt59\" (UniqueName: \"kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59\") pod \"auto-csr-approver-29566944-plgqk\" (UID: \"d18e4fe0-a1b7-48bc-843b-94faa5785712\") " pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.473041 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.876311 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-plgqk"]
Mar 20 14:24:00 crc kubenswrapper[4856]: I0320 14:24:00.882844 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:24:01 crc kubenswrapper[4856]: I0320 14:24:01.495165 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-plgqk" event={"ID":"d18e4fe0-a1b7-48bc-843b-94faa5785712","Type":"ContainerStarted","Data":"f6036f981ebd6c579ee56dc0d86ccc670bd0f034d6f5e44c0e399e2f2e2d0fff"}
Mar 20 14:24:03 crc kubenswrapper[4856]: I0320 14:24:03.513540 4856 generic.go:334] "Generic (PLEG): container finished" podID="d18e4fe0-a1b7-48bc-843b-94faa5785712" containerID="828b19ce9293f83dee927f31c83f084f54b2d611b4e4aeaa9bfef9388b8895b7" exitCode=0
Mar 20 14:24:03 crc kubenswrapper[4856]: I0320 14:24:03.513616 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-plgqk" event={"ID":"d18e4fe0-a1b7-48bc-843b-94faa5785712","Type":"ContainerDied","Data":"828b19ce9293f83dee927f31c83f084f54b2d611b4e4aeaa9bfef9388b8895b7"}
Mar 20 14:24:04 crc kubenswrapper[4856]: I0320 14:24:04.813170 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:04 crc kubenswrapper[4856]: I0320 14:24:04.875144 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbt59\" (UniqueName: \"kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59\") pod \"d18e4fe0-a1b7-48bc-843b-94faa5785712\" (UID: \"d18e4fe0-a1b7-48bc-843b-94faa5785712\") "
Mar 20 14:24:04 crc kubenswrapper[4856]: I0320 14:24:04.885490 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59" (OuterVolumeSpecName: "kube-api-access-lbt59") pod "d18e4fe0-a1b7-48bc-843b-94faa5785712" (UID: "d18e4fe0-a1b7-48bc-843b-94faa5785712"). InnerVolumeSpecName "kube-api-access-lbt59". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:24:04 crc kubenswrapper[4856]: I0320 14:24:04.976576 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbt59\" (UniqueName: \"kubernetes.io/projected/d18e4fe0-a1b7-48bc-843b-94faa5785712-kube-api-access-lbt59\") on node \"crc\" DevicePath \"\""
Mar 20 14:24:05 crc kubenswrapper[4856]: I0320 14:24:05.531306 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-plgqk" event={"ID":"d18e4fe0-a1b7-48bc-843b-94faa5785712","Type":"ContainerDied","Data":"f6036f981ebd6c579ee56dc0d86ccc670bd0f034d6f5e44c0e399e2f2e2d0fff"}
Mar 20 14:24:05 crc kubenswrapper[4856]: I0320 14:24:05.531355 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6036f981ebd6c579ee56dc0d86ccc670bd0f034d6f5e44c0e399e2f2e2d0fff"
Mar 20 14:24:05 crc kubenswrapper[4856]: I0320 14:24:05.531439 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-plgqk"
Mar 20 14:24:05 crc kubenswrapper[4856]: I0320 14:24:05.879900 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-9kk29"]
Mar 20 14:24:05 crc kubenswrapper[4856]: I0320 14:24:05.886170 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-9kk29"]
Mar 20 14:24:07 crc kubenswrapper[4856]: I0320 14:24:07.839058 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737aaad0-34bc-44c8-aa37-b49e938af62a" path="/var/lib/kubelet/pods/737aaad0-34bc-44c8-aa37-b49e938af62a/volumes"
Mar 20 14:24:09 crc kubenswrapper[4856]: I0320 14:24:09.988064 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:24:09 crc kubenswrapper[4856]: I0320 14:24:09.988150 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.810565 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"]
Mar 20 14:24:24 crc kubenswrapper[4856]: E0320 14:24:24.812381 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d18e4fe0-a1b7-48bc-843b-94faa5785712" containerName="oc"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.812400 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d18e4fe0-a1b7-48bc-843b-94faa5785712" containerName="oc"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.812816 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d18e4fe0-a1b7-48bc-843b-94faa5785712" containerName="oc"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.814146 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.839360 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"]
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.873546 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.873737 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.873985 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcd64\" (UniqueName: \"kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.976700 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.976781 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcd64\" (UniqueName: \"kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.976837 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.977197 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.977256 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:24 crc kubenswrapper[4856]: I0320 14:24:24.998009 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcd64\" (UniqueName: \"kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64\") pod \"redhat-marketplace-wc729\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") " pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:25 crc kubenswrapper[4856]: I0320 14:24:25.138819 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:25 crc kubenswrapper[4856]: I0320 14:24:25.584202 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"]
Mar 20 14:24:25 crc kubenswrapper[4856]: I0320 14:24:25.693631 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerStarted","Data":"c5263533052053a96cf8cbdaff42686d00b9f047310a141c0cc21463c20c394d"}
Mar 20 14:24:26 crc kubenswrapper[4856]: I0320 14:24:26.704451 4856 generic.go:334] "Generic (PLEG): container finished" podID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerID="4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7" exitCode=0
Mar 20 14:24:26 crc kubenswrapper[4856]: I0320 14:24:26.704511 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerDied","Data":"4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7"}
Mar 20 14:24:29 crc kubenswrapper[4856]: I0320 14:24:29.729674 4856 generic.go:334] "Generic (PLEG): container finished" podID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerID="a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65" exitCode=0
Mar 20 14:24:29 crc kubenswrapper[4856]: I0320 14:24:29.729843 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerDied","Data":"a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65"}
Mar 20 14:24:30 crc kubenswrapper[4856]: I0320 14:24:30.738915 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerStarted","Data":"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c"}
Mar 20 14:24:30 crc kubenswrapper[4856]: I0320 14:24:30.762719 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wc729" podStartSLOduration=3.291591683 podStartE2EDuration="6.762693398s" podCreationTimestamp="2026-03-20 14:24:24 +0000 UTC" firstStartedPulling="2026-03-20 14:24:26.709425146 +0000 UTC m=+3681.590451276" lastFinishedPulling="2026-03-20 14:24:30.180526851 +0000 UTC m=+3685.061552991" observedRunningTime="2026-03-20 14:24:30.758136233 +0000 UTC m=+3685.639162413" watchObservedRunningTime="2026-03-20 14:24:30.762693398 +0000 UTC m=+3685.643719528"
Mar 20 14:24:34 crc kubenswrapper[4856]: I0320 14:24:34.720046 4856 scope.go:117] "RemoveContainer" containerID="edf9ddbcfed6996f2009b404c0247671b6b05aa3d779fc8cff96206af9d1abe0"
Mar 20 14:24:35 crc kubenswrapper[4856]: I0320 14:24:35.140097 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:35 crc kubenswrapper[4856]: I0320 14:24:35.140816 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:35 crc kubenswrapper[4856]: I0320 14:24:35.187599 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:35 crc kubenswrapper[4856]: I0320 14:24:35.845231 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:35 crc kubenswrapper[4856]: I0320 14:24:35.895625 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"]
Mar 20 14:24:37 crc kubenswrapper[4856]: I0320 14:24:37.805527 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wc729" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="registry-server" containerID="cri-o://a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c" gracePeriod=2
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.213282 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wc729"
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.282447 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities\") pod \"e4801c4d-04d2-4965-92d2-0b986f5066e0\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") "
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.282596 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content\") pod \"e4801c4d-04d2-4965-92d2-0b986f5066e0\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") "
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.282692 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcd64\" (UniqueName: \"kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64\") pod \"e4801c4d-04d2-4965-92d2-0b986f5066e0\" (UID: \"e4801c4d-04d2-4965-92d2-0b986f5066e0\") "
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.283901 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities" (OuterVolumeSpecName: "utilities") pod "e4801c4d-04d2-4965-92d2-0b986f5066e0" (UID: "e4801c4d-04d2-4965-92d2-0b986f5066e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.292177 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64" (OuterVolumeSpecName: "kube-api-access-hcd64") pod "e4801c4d-04d2-4965-92d2-0b986f5066e0" (UID: "e4801c4d-04d2-4965-92d2-0b986f5066e0"). InnerVolumeSpecName "kube-api-access-hcd64". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.317999 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4801c4d-04d2-4965-92d2-0b986f5066e0" (UID: "e4801c4d-04d2-4965-92d2-0b986f5066e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.385000 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.385038 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4801c4d-04d2-4965-92d2-0b986f5066e0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.385050 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcd64\" (UniqueName: \"kubernetes.io/projected/e4801c4d-04d2-4965-92d2-0b986f5066e0-kube-api-access-hcd64\") on node \"crc\" DevicePath \"\""
Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.815421 4856 generic.go:334] "Generic (PLEG): container finished" podID="e4801c4d-04d2-4965-92d2-0b986f5066e0"
containerID="a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c" exitCode=0 Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.815476 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerDied","Data":"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c"} Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.815518 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wc729" event={"ID":"e4801c4d-04d2-4965-92d2-0b986f5066e0","Type":"ContainerDied","Data":"c5263533052053a96cf8cbdaff42686d00b9f047310a141c0cc21463c20c394d"} Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.815543 4856 scope.go:117] "RemoveContainer" containerID="a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.815552 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wc729" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.848015 4856 scope.go:117] "RemoveContainer" containerID="a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.858730 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"] Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.866650 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wc729"] Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.874164 4856 scope.go:117] "RemoveContainer" containerID="4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.900982 4856 scope.go:117] "RemoveContainer" containerID="a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c" Mar 20 14:24:38 crc kubenswrapper[4856]: E0320 14:24:38.901612 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c\": container with ID starting with a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c not found: ID does not exist" containerID="a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.901668 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c"} err="failed to get container status \"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c\": rpc error: code = NotFound desc = could not find container \"a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c\": container with ID starting with a64e938457e280ad0278f5da93b71d84d9a293aeb51fc6b9da81848ce7d49e9c not found: 
ID does not exist" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.901702 4856 scope.go:117] "RemoveContainer" containerID="a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65" Mar 20 14:24:38 crc kubenswrapper[4856]: E0320 14:24:38.902175 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65\": container with ID starting with a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65 not found: ID does not exist" containerID="a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.902207 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65"} err="failed to get container status \"a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65\": rpc error: code = NotFound desc = could not find container \"a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65\": container with ID starting with a161d2737cd4f108e2c68d7f2321b6a8a6441ea5b519a37e2420c66d82270b65 not found: ID does not exist" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.902232 4856 scope.go:117] "RemoveContainer" containerID="4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7" Mar 20 14:24:38 crc kubenswrapper[4856]: E0320 14:24:38.902601 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7\": container with ID starting with 4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7 not found: ID does not exist" containerID="4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7" Mar 20 14:24:38 crc kubenswrapper[4856]: I0320 14:24:38.902625 4856 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7"} err="failed to get container status \"4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7\": rpc error: code = NotFound desc = could not find container \"4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7\": container with ID starting with 4acccfde43dc5f86cdf24f869ddc093d61c2ab184956f3eaa6724bf53b488dd7 not found: ID does not exist" Mar 20 14:24:39 crc kubenswrapper[4856]: I0320 14:24:39.829479 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" path="/var/lib/kubelet/pods/e4801c4d-04d2-4965-92d2-0b986f5066e0/volumes" Mar 20 14:24:39 crc kubenswrapper[4856]: I0320 14:24:39.987968 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:24:39 crc kubenswrapper[4856]: I0320 14:24:39.988065 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:25:09 crc kubenswrapper[4856]: I0320 14:25:09.987970 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:25:09 crc kubenswrapper[4856]: I0320 14:25:09.988537 4856 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:25:09 crc kubenswrapper[4856]: I0320 14:25:09.988591 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:25:09 crc kubenswrapper[4856]: I0320 14:25:09.989222 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:25:09 crc kubenswrapper[4856]: I0320 14:25:09.989305 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f" gracePeriod=600 Mar 20 14:25:11 crc kubenswrapper[4856]: I0320 14:25:11.042020 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f" exitCode=0 Mar 20 14:25:11 crc kubenswrapper[4856]: I0320 14:25:11.042166 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f"} Mar 20 14:25:11 crc kubenswrapper[4856]: I0320 14:25:11.042996 4856 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"} Mar 20 14:25:11 crc kubenswrapper[4856]: I0320 14:25:11.043027 4856 scope.go:117] "RemoveContainer" containerID="9ef6c50404fe9099f921fbcb707c1948fdfb5fdc858b30d115395e93e72ebe9d" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.135403 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566946-68bcb"] Mar 20 14:26:00 crc kubenswrapper[4856]: E0320 14:26:00.136302 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="registry-server" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.136318 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="registry-server" Mar 20 14:26:00 crc kubenswrapper[4856]: E0320 14:26:00.136359 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="extract-utilities" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.136368 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="extract-utilities" Mar 20 14:26:00 crc kubenswrapper[4856]: E0320 14:26:00.136378 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="extract-content" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.136385 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="extract-content" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.136564 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4801c4d-04d2-4965-92d2-0b986f5066e0" containerName="registry-server" Mar 20 
14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.137108 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.138983 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.139937 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.139970 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.143312 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-68bcb"] Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.290513 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q222b\" (UniqueName: \"kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b\") pod \"auto-csr-approver-29566946-68bcb\" (UID: \"fa602330-72bc-4a6a-9d88-74a383941fd3\") " pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.392330 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q222b\" (UniqueName: \"kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b\") pod \"auto-csr-approver-29566946-68bcb\" (UID: \"fa602330-72bc-4a6a-9d88-74a383941fd3\") " pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.422671 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q222b\" (UniqueName: 
\"kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b\") pod \"auto-csr-approver-29566946-68bcb\" (UID: \"fa602330-72bc-4a6a-9d88-74a383941fd3\") " pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.458739 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:00 crc kubenswrapper[4856]: I0320 14:26:00.861550 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-68bcb"] Mar 20 14:26:01 crc kubenswrapper[4856]: I0320 14:26:01.474555 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-68bcb" event={"ID":"fa602330-72bc-4a6a-9d88-74a383941fd3","Type":"ContainerStarted","Data":"fac9ef8345f333fa8b31a6c3536b11a39178443b28de992c2b029d15403e2511"} Mar 20 14:26:02 crc kubenswrapper[4856]: I0320 14:26:02.483036 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-68bcb" event={"ID":"fa602330-72bc-4a6a-9d88-74a383941fd3","Type":"ContainerStarted","Data":"fe5b79b849a5f23e8168332d6c7562860e647026d1346645443f369b1061b0a4"} Mar 20 14:26:02 crc kubenswrapper[4856]: I0320 14:26:02.499953 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566946-68bcb" podStartSLOduration=1.435135166 podStartE2EDuration="2.499937106s" podCreationTimestamp="2026-03-20 14:26:00 +0000 UTC" firstStartedPulling="2026-03-20 14:26:00.870356907 +0000 UTC m=+3775.751383037" lastFinishedPulling="2026-03-20 14:26:01.935158857 +0000 UTC m=+3776.816184977" observedRunningTime="2026-03-20 14:26:02.496693318 +0000 UTC m=+3777.377719468" watchObservedRunningTime="2026-03-20 14:26:02.499937106 +0000 UTC m=+3777.380963236" Mar 20 14:26:03 crc kubenswrapper[4856]: I0320 14:26:03.492501 4856 generic.go:334] "Generic (PLEG): container 
finished" podID="fa602330-72bc-4a6a-9d88-74a383941fd3" containerID="fe5b79b849a5f23e8168332d6c7562860e647026d1346645443f369b1061b0a4" exitCode=0 Mar 20 14:26:03 crc kubenswrapper[4856]: I0320 14:26:03.492544 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-68bcb" event={"ID":"fa602330-72bc-4a6a-9d88-74a383941fd3","Type":"ContainerDied","Data":"fe5b79b849a5f23e8168332d6c7562860e647026d1346645443f369b1061b0a4"} Mar 20 14:26:04 crc kubenswrapper[4856]: I0320 14:26:04.790519 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:04 crc kubenswrapper[4856]: I0320 14:26:04.948881 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q222b\" (UniqueName: \"kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b\") pod \"fa602330-72bc-4a6a-9d88-74a383941fd3\" (UID: \"fa602330-72bc-4a6a-9d88-74a383941fd3\") " Mar 20 14:26:04 crc kubenswrapper[4856]: I0320 14:26:04.955664 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b" (OuterVolumeSpecName: "kube-api-access-q222b") pod "fa602330-72bc-4a6a-9d88-74a383941fd3" (UID: "fa602330-72bc-4a6a-9d88-74a383941fd3"). InnerVolumeSpecName "kube-api-access-q222b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.050775 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q222b\" (UniqueName: \"kubernetes.io/projected/fa602330-72bc-4a6a-9d88-74a383941fd3-kube-api-access-q222b\") on node \"crc\" DevicePath \"\"" Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.505577 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-68bcb" event={"ID":"fa602330-72bc-4a6a-9d88-74a383941fd3","Type":"ContainerDied","Data":"fac9ef8345f333fa8b31a6c3536b11a39178443b28de992c2b029d15403e2511"} Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.505948 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fac9ef8345f333fa8b31a6c3536b11a39178443b28de992c2b029d15403e2511" Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.505646 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-68bcb" Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.556975 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-86spd"] Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.563631 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-86spd"] Mar 20 14:26:05 crc kubenswrapper[4856]: I0320 14:26:05.833530 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66ec89b-f68a-4018-958a-746d7b1b3d11" path="/var/lib/kubelet/pods/f66ec89b-f68a-4018-958a-746d7b1b3d11/volumes" Mar 20 14:26:34 crc kubenswrapper[4856]: I0320 14:26:34.826218 4856 scope.go:117] "RemoveContainer" containerID="a4c2ad1223c0ed0f566f4e1d0328a497c7be9119f719a070f60c2b073021ab17" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.846149 4856 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-mmr89"] Mar 20 14:27:03 crc kubenswrapper[4856]: E0320 14:27:03.846960 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa602330-72bc-4a6a-9d88-74a383941fd3" containerName="oc" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.846974 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa602330-72bc-4a6a-9d88-74a383941fd3" containerName="oc" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.847143 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa602330-72bc-4a6a-9d88-74a383941fd3" containerName="oc" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.848197 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmr89"] Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.848301 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.926230 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.926334 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k66xd\" (UniqueName: \"kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:03 crc kubenswrapper[4856]: I0320 14:27:03.926365 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.027191 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.027262 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k66xd\" (UniqueName: \"kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.027300 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.027727 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.027812 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.046970 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k66xd\" (UniqueName: \"kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd\") pod \"community-operators-mmr89\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") " pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.185622 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:04 crc kubenswrapper[4856]: I0320 14:27:04.674799 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mmr89"] Mar 20 14:27:05 crc kubenswrapper[4856]: I0320 14:27:05.154442 4856 generic.go:334] "Generic (PLEG): container finished" podID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerID="5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169" exitCode=0 Mar 20 14:27:05 crc kubenswrapper[4856]: I0320 14:27:05.154533 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerDied","Data":"5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169"} Mar 20 14:27:05 crc kubenswrapper[4856]: I0320 14:27:05.154628 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerStarted","Data":"3c680d5e365a210825bc9b508a669793126eb6ad5df8443aedb182615d4bcdda"} Mar 20 14:27:06 crc kubenswrapper[4856]: I0320 14:27:06.164853 4856 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerStarted","Data":"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"} Mar 20 14:27:07 crc kubenswrapper[4856]: I0320 14:27:07.175784 4856 generic.go:334] "Generic (PLEG): container finished" podID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerID="379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6" exitCode=0 Mar 20 14:27:07 crc kubenswrapper[4856]: I0320 14:27:07.175835 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerDied","Data":"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"} Mar 20 14:27:08 crc kubenswrapper[4856]: I0320 14:27:08.185444 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerStarted","Data":"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"} Mar 20 14:27:08 crc kubenswrapper[4856]: I0320 14:27:08.208525 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mmr89" podStartSLOduration=2.80487198 podStartE2EDuration="5.208504702s" podCreationTimestamp="2026-03-20 14:27:03 +0000 UTC" firstStartedPulling="2026-03-20 14:27:05.156901848 +0000 UTC m=+3840.037927998" lastFinishedPulling="2026-03-20 14:27:07.56053455 +0000 UTC m=+3842.441560720" observedRunningTime="2026-03-20 14:27:08.201925841 +0000 UTC m=+3843.082951961" watchObservedRunningTime="2026-03-20 14:27:08.208504702 +0000 UTC m=+3843.089530832" Mar 20 14:27:14 crc kubenswrapper[4856]: I0320 14:27:14.186730 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mmr89" Mar 20 14:27:14 crc kubenswrapper[4856]: I0320 
14:27:14.187064 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mmr89"
Mar 20 14:27:14 crc kubenswrapper[4856]: I0320 14:27:14.245761 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mmr89"
Mar 20 14:27:15 crc kubenswrapper[4856]: I0320 14:27:15.293448 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mmr89"
Mar 20 14:27:15 crc kubenswrapper[4856]: I0320 14:27:15.348939 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mmr89"]
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.266659 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mmr89" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="registry-server" containerID="cri-o://ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a" gracePeriod=2
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.631934 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmr89"
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.745817 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities\") pod \"a0a76db4-35a2-4b17-b7df-68eccb19d448\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") "
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.745957 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content\") pod \"a0a76db4-35a2-4b17-b7df-68eccb19d448\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") "
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.746012 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k66xd\" (UniqueName: \"kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd\") pod \"a0a76db4-35a2-4b17-b7df-68eccb19d448\" (UID: \"a0a76db4-35a2-4b17-b7df-68eccb19d448\") "
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.748810 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities" (OuterVolumeSpecName: "utilities") pod "a0a76db4-35a2-4b17-b7df-68eccb19d448" (UID: "a0a76db4-35a2-4b17-b7df-68eccb19d448"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.754112 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd" (OuterVolumeSpecName: "kube-api-access-k66xd") pod "a0a76db4-35a2-4b17-b7df-68eccb19d448" (UID: "a0a76db4-35a2-4b17-b7df-68eccb19d448"). InnerVolumeSpecName "kube-api-access-k66xd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.812661 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0a76db4-35a2-4b17-b7df-68eccb19d448" (UID: "a0a76db4-35a2-4b17-b7df-68eccb19d448"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.848397 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.848477 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k66xd\" (UniqueName: \"kubernetes.io/projected/a0a76db4-35a2-4b17-b7df-68eccb19d448-kube-api-access-k66xd\") on node \"crc\" DevicePath \"\""
Mar 20 14:27:17 crc kubenswrapper[4856]: I0320 14:27:17.848494 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0a76db4-35a2-4b17-b7df-68eccb19d448-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.279505 4856 generic.go:334] "Generic (PLEG): container finished" podID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerID="ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a" exitCode=0
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.279565 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerDied","Data":"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"}
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.279570 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mmr89"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.279607 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mmr89" event={"ID":"a0a76db4-35a2-4b17-b7df-68eccb19d448","Type":"ContainerDied","Data":"3c680d5e365a210825bc9b508a669793126eb6ad5df8443aedb182615d4bcdda"}
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.279628 4856 scope.go:117] "RemoveContainer" containerID="ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.316106 4856 scope.go:117] "RemoveContainer" containerID="379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.329896 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mmr89"]
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.337717 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mmr89"]
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.348996 4856 scope.go:117] "RemoveContainer" containerID="5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.377712 4856 scope.go:117] "RemoveContainer" containerID="ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"
Mar 20 14:27:18 crc kubenswrapper[4856]: E0320 14:27:18.378149 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a\": container with ID starting with ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a not found: ID does not exist" containerID="ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.378194 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a"} err="failed to get container status \"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a\": rpc error: code = NotFound desc = could not find container \"ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a\": container with ID starting with ea150614c19b125bd49110e50db6b2d437749877e04667fe1e8d49ea741e439a not found: ID does not exist"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.378222 4856 scope.go:117] "RemoveContainer" containerID="379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"
Mar 20 14:27:18 crc kubenswrapper[4856]: E0320 14:27:18.378579 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6\": container with ID starting with 379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6 not found: ID does not exist" containerID="379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.378610 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6"} err="failed to get container status \"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6\": rpc error: code = NotFound desc = could not find container \"379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6\": container with ID starting with 379817d567e32f0465a1622b4921505ec1bd1c73f8048c8401c6a4c24c29bce6 not found: ID does not exist"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.378638 4856 scope.go:117] "RemoveContainer" containerID="5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169"
Mar 20 14:27:18 crc kubenswrapper[4856]: E0320 14:27:18.378848 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169\": container with ID starting with 5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169 not found: ID does not exist" containerID="5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169"
Mar 20 14:27:18 crc kubenswrapper[4856]: I0320 14:27:18.378876 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169"} err="failed to get container status \"5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169\": rpc error: code = NotFound desc = could not find container \"5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169\": container with ID starting with 5dfc6ac7d63567e1e9fa45533976526937c0a3ab897b59aa9d0c9e7eed1b6169 not found: ID does not exist"
Mar 20 14:27:19 crc kubenswrapper[4856]: I0320 14:27:19.827115 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" path="/var/lib/kubelet/pods/a0a76db4-35a2-4b17-b7df-68eccb19d448/volumes"
Mar 20 14:27:39 crc kubenswrapper[4856]: I0320 14:27:39.987118 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:27:39 crc kubenswrapper[4856]: I0320 14:27:39.987640 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.142126 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566948-wkcpr"]
Mar 20 14:28:00 crc kubenswrapper[4856]: E0320 14:28:00.143052 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="extract-utilities"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.143066 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="extract-utilities"
Mar 20 14:28:00 crc kubenswrapper[4856]: E0320 14:28:00.143094 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="registry-server"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.143101 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="registry-server"
Mar 20 14:28:00 crc kubenswrapper[4856]: E0320 14:28:00.143109 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="extract-content"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.143116 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="extract-content"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.143430 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a76db4-35a2-4b17-b7df-68eccb19d448" containerName="registry-server"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.145955 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.151489 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.151846 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-wkcpr"]
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.151971 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.152032 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.194923 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bqg8\" (UniqueName: \"kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8\") pod \"auto-csr-approver-29566948-wkcpr\" (UID: \"3f81b3d8-e3f8-4519-b669-d20a7371ac14\") " pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.296926 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bqg8\" (UniqueName: \"kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8\") pod \"auto-csr-approver-29566948-wkcpr\" (UID: \"3f81b3d8-e3f8-4519-b669-d20a7371ac14\") " pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.316150 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bqg8\" (UniqueName: \"kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8\") pod \"auto-csr-approver-29566948-wkcpr\" (UID: \"3f81b3d8-e3f8-4519-b669-d20a7371ac14\") " pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.463742 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:00 crc kubenswrapper[4856]: I0320 14:28:00.857562 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-wkcpr"]
Mar 20 14:28:01 crc kubenswrapper[4856]: I0320 14:28:01.611748 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-wkcpr" event={"ID":"3f81b3d8-e3f8-4519-b669-d20a7371ac14","Type":"ContainerStarted","Data":"c0890cbbab40676932de65382aa1eb90ceb828f191820bac861fdcbb1a3e4c3d"}
Mar 20 14:28:04 crc kubenswrapper[4856]: I0320 14:28:04.636248 4856 generic.go:334] "Generic (PLEG): container finished" podID="3f81b3d8-e3f8-4519-b669-d20a7371ac14" containerID="9f50da79dcc154686d49c67a68f6846b029264d59e3aec2d30e8c084252ea40c" exitCode=0
Mar 20 14:28:04 crc kubenswrapper[4856]: I0320 14:28:04.636582 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-wkcpr" event={"ID":"3f81b3d8-e3f8-4519-b669-d20a7371ac14","Type":"ContainerDied","Data":"9f50da79dcc154686d49c67a68f6846b029264d59e3aec2d30e8c084252ea40c"}
Mar 20 14:28:05 crc kubenswrapper[4856]: I0320 14:28:05.906517 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:05 crc kubenswrapper[4856]: I0320 14:28:05.970637 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bqg8\" (UniqueName: \"kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8\") pod \"3f81b3d8-e3f8-4519-b669-d20a7371ac14\" (UID: \"3f81b3d8-e3f8-4519-b669-d20a7371ac14\") "
Mar 20 14:28:05 crc kubenswrapper[4856]: I0320 14:28:05.976629 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8" (OuterVolumeSpecName: "kube-api-access-8bqg8") pod "3f81b3d8-e3f8-4519-b669-d20a7371ac14" (UID: "3f81b3d8-e3f8-4519-b669-d20a7371ac14"). InnerVolumeSpecName "kube-api-access-8bqg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.071916 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bqg8\" (UniqueName: \"kubernetes.io/projected/3f81b3d8-e3f8-4519-b669-d20a7371ac14-kube-api-access-8bqg8\") on node \"crc\" DevicePath \"\""
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.652990 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-wkcpr" event={"ID":"3f81b3d8-e3f8-4519-b669-d20a7371ac14","Type":"ContainerDied","Data":"c0890cbbab40676932de65382aa1eb90ceb828f191820bac861fdcbb1a3e4c3d"}
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.653027 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0890cbbab40676932de65382aa1eb90ceb828f191820bac861fdcbb1a3e4c3d"
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.653037 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-wkcpr"
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.982755 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-2j49n"]
Mar 20 14:28:06 crc kubenswrapper[4856]: I0320 14:28:06.990409 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-2j49n"]
Mar 20 14:28:07 crc kubenswrapper[4856]: I0320 14:28:07.830254 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c88e41-149c-4ad7-9b82-94979ec6ceeb" path="/var/lib/kubelet/pods/13c88e41-149c-4ad7-9b82-94979ec6ceeb/volumes"
Mar 20 14:28:09 crc kubenswrapper[4856]: I0320 14:28:09.987693 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:28:09 crc kubenswrapper[4856]: I0320 14:28:09.987968 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:28:34 crc kubenswrapper[4856]: I0320 14:28:34.921026 4856 scope.go:117] "RemoveContainer" containerID="af002cb82aa149c9a4d5b242f018c1ca6bce5ecfb59b9e5e7bba661eca5f28ab"
Mar 20 14:28:39 crc kubenswrapper[4856]: I0320 14:28:39.988182 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:28:39 crc kubenswrapper[4856]: I0320 14:28:39.988661 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:28:39 crc kubenswrapper[4856]: I0320 14:28:39.988710 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4"
Mar 20 14:28:39 crc kubenswrapper[4856]: I0320 14:28:39.989308 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:28:39 crc kubenswrapper[4856]: I0320 14:28:39.989351 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" gracePeriod=600
Mar 20 14:28:40 crc kubenswrapper[4856]: E0320 14:28:40.146506 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:28:41 crc kubenswrapper[4856]: I0320 14:28:41.020321 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" exitCode=0
Mar 20 14:28:41 crc kubenswrapper[4856]: I0320 14:28:41.020412 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"}
Mar 20 14:28:41 crc kubenswrapper[4856]: I0320 14:28:41.020825 4856 scope.go:117] "RemoveContainer" containerID="1a0d2859954ebd8a883b994afd2985705b0ed1c4522dff0cfc6b92648928bc0f"
Mar 20 14:28:41 crc kubenswrapper[4856]: I0320 14:28:41.021285 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:28:41 crc kubenswrapper[4856]: E0320 14:28:41.021622 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:28:54 crc kubenswrapper[4856]: I0320 14:28:54.819622 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:28:54 crc kubenswrapper[4856]: E0320 14:28:54.821196 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:29:06 crc kubenswrapper[4856]: I0320 14:29:06.820358 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:29:06 crc kubenswrapper[4856]: E0320 14:29:06.821126 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:29:17 crc kubenswrapper[4856]: I0320 14:29:17.820057 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:29:17 crc kubenswrapper[4856]: E0320 14:29:17.820590 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:29:31 crc kubenswrapper[4856]: I0320 14:29:31.820904 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:29:31 crc kubenswrapper[4856]: E0320 14:29:31.822046 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:29:44 crc kubenswrapper[4856]: I0320 14:29:44.820889 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:29:44 crc kubenswrapper[4856]: E0320 14:29:44.822070 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:29:57 crc kubenswrapper[4856]: I0320 14:29:57.819741 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1"
Mar 20 14:29:57 crc kubenswrapper[4856]: E0320 14:29:57.820845 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.156193 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566950-wsqbw"]
Mar 20 14:30:00 crc kubenswrapper[4856]: E0320 14:30:00.156953 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f81b3d8-e3f8-4519-b669-d20a7371ac14" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.156974 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f81b3d8-e3f8-4519-b669-d20a7371ac14" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.157224 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f81b3d8-e3f8-4519-b669-d20a7371ac14" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.157849 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-wsqbw"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.160390 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.160557 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.160473 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.168163 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"]
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.169132 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.178092 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.179390 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.186805 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"]
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.209244 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-wsqbw"]
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.292618 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.292719 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.292748 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4dx\" (UniqueName: \"kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx\") pod \"auto-csr-approver-29566950-wsqbw\" (UID: \"76260dae-e420-469d-9b2b-97daf5635aa9\") " pod="openshift-infra/auto-csr-approver-29566950-wsqbw"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.292904 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27sds\" (UniqueName: \"kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.393980 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.394063 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.394089 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4dx\" (UniqueName: \"kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx\") pod \"auto-csr-approver-29566950-wsqbw\" (UID: \"76260dae-e420-469d-9b2b-97daf5635aa9\") " pod="openshift-infra/auto-csr-approver-29566950-wsqbw"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.394123 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27sds\" (UniqueName: \"kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.395447 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.401566 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.412826 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27sds\" (UniqueName: \"kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds\") pod \"collect-profiles-29566950-tflhc\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.413545 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4dx\" (UniqueName: \"kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx\") pod \"auto-csr-approver-29566950-wsqbw\" (UID: \"76260dae-e420-469d-9b2b-97daf5635aa9\") " pod="openshift-infra/auto-csr-approver-29566950-wsqbw"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.483147 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-wsqbw"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.495303 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.930730 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc"]
Mar 20 14:30:00 crc kubenswrapper[4856]: W0320 14:30:00.934502 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57d8289b_7203_469a_aa61_0bb7f87f5e3f.slice/crio-7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72 WatchSource:0}: Error finding container 7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72: Status 404 returned error can't find the container with id 7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72
Mar 20 14:30:00 crc kubenswrapper[4856]: I0320 14:30:00.997126 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-wsqbw"]
Mar 20 14:30:01 crc kubenswrapper[4856]: W0320 14:30:01.011061 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76260dae_e420_469d_9b2b_97daf5635aa9.slice/crio-b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd WatchSource:0}: Error finding container b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd: Status 404 returned error can't find the container with id b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd
Mar 20 14:30:01 crc kubenswrapper[4856]: I0320 14:30:01.013663 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:30:01 crc kubenswrapper[4856]: I0320 14:30:01.561887 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-wsqbw" event={"ID":"76260dae-e420-469d-9b2b-97daf5635aa9","Type":"ContainerStarted","Data":"b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd"}
Mar 20 14:30:01 crc kubenswrapper[4856]: I0320 14:30:01.563496 4856 generic.go:334] "Generic (PLEG): container finished" podID="57d8289b-7203-469a-aa61-0bb7f87f5e3f" containerID="cca8678929ae6c204e085af0fe83381a9e9eb287fca02cf580164035a9d12cd4" exitCode=0
Mar 20 14:30:01 crc kubenswrapper[4856]: I0320 14:30:01.563538 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc" event={"ID":"57d8289b-7203-469a-aa61-0bb7f87f5e3f","Type":"ContainerDied","Data":"cca8678929ae6c204e085af0fe83381a9e9eb287fca02cf580164035a9d12cd4"}
Mar 20 14:30:01 crc kubenswrapper[4856]: I0320 14:30:01.563557 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc" event={"ID":"57d8289b-7203-469a-aa61-0bb7f87f5e3f","Type":"ContainerStarted","Data":"7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72"}
Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.118818 4856 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.239894 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume\") pod \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.240406 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27sds\" (UniqueName: \"kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds\") pod \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.240448 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume\") pod \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\" (UID: \"57d8289b-7203-469a-aa61-0bb7f87f5e3f\") " Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.240715 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume" (OuterVolumeSpecName: "config-volume") pod "57d8289b-7203-469a-aa61-0bb7f87f5e3f" (UID: "57d8289b-7203-469a-aa61-0bb7f87f5e3f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.245261 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds" (OuterVolumeSpecName: "kube-api-access-27sds") pod "57d8289b-7203-469a-aa61-0bb7f87f5e3f" (UID: "57d8289b-7203-469a-aa61-0bb7f87f5e3f"). 
InnerVolumeSpecName "kube-api-access-27sds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.249797 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "57d8289b-7203-469a-aa61-0bb7f87f5e3f" (UID: "57d8289b-7203-469a-aa61-0bb7f87f5e3f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.342202 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27sds\" (UniqueName: \"kubernetes.io/projected/57d8289b-7203-469a-aa61-0bb7f87f5e3f-kube-api-access-27sds\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.342256 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/57d8289b-7203-469a-aa61-0bb7f87f5e3f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.342282 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/57d8289b-7203-469a-aa61-0bb7f87f5e3f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.577021 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc" event={"ID":"57d8289b-7203-469a-aa61-0bb7f87f5e3f","Type":"ContainerDied","Data":"7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72"} Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.577058 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a088242299899953ef8336b987e17ffb6434e96b1d7d8415e5cc855c1cd9f72" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.577085 4856 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-tflhc" Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.579856 4856 generic.go:334] "Generic (PLEG): container finished" podID="76260dae-e420-469d-9b2b-97daf5635aa9" containerID="81606a0f52dab6ebb13e05eab0d532a097d70a81fff24577e555e0fe2ae1b2a9" exitCode=0 Mar 20 14:30:03 crc kubenswrapper[4856]: I0320 14:30:03.579900 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-wsqbw" event={"ID":"76260dae-e420-469d-9b2b-97daf5635aa9","Type":"ContainerDied","Data":"81606a0f52dab6ebb13e05eab0d532a097d70a81fff24577e555e0fe2ae1b2a9"} Mar 20 14:30:04 crc kubenswrapper[4856]: I0320 14:30:04.191170 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8"] Mar 20 14:30:04 crc kubenswrapper[4856]: I0320 14:30:04.196053 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-8hzb8"] Mar 20 14:30:04 crc kubenswrapper[4856]: I0320 14:30:04.931989 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-wsqbw" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.063441 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx4dx\" (UniqueName: \"kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx\") pod \"76260dae-e420-469d-9b2b-97daf5635aa9\" (UID: \"76260dae-e420-469d-9b2b-97daf5635aa9\") " Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.066938 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx" (OuterVolumeSpecName: "kube-api-access-qx4dx") pod "76260dae-e420-469d-9b2b-97daf5635aa9" (UID: "76260dae-e420-469d-9b2b-97daf5635aa9"). InnerVolumeSpecName "kube-api-access-qx4dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.165406 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx4dx\" (UniqueName: \"kubernetes.io/projected/76260dae-e420-469d-9b2b-97daf5635aa9-kube-api-access-qx4dx\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.594035 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-wsqbw" event={"ID":"76260dae-e420-469d-9b2b-97daf5635aa9","Type":"ContainerDied","Data":"b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd"} Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.594071 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9db3e772ef4bc5e4f958e4770f4133e3a79cc1691161b65f3f7090917a076bd" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.594083 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-wsqbw" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.830138 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ab70ac0-2902-4f84-9142-060f5adee35b" path="/var/lib/kubelet/pods/2ab70ac0-2902-4f84-9142-060f5adee35b/volumes" Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.983080 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-plgqk"] Mar 20 14:30:05 crc kubenswrapper[4856]: I0320 14:30:05.988422 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-plgqk"] Mar 20 14:30:07 crc kubenswrapper[4856]: I0320 14:30:07.829449 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d18e4fe0-a1b7-48bc-843b-94faa5785712" path="/var/lib/kubelet/pods/d18e4fe0-a1b7-48bc-843b-94faa5785712/volumes" Mar 20 14:30:08 crc kubenswrapper[4856]: I0320 14:30:08.820490 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:30:08 crc kubenswrapper[4856]: E0320 14:30:08.821129 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:30:23 crc kubenswrapper[4856]: I0320 14:30:23.819969 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:30:23 crc kubenswrapper[4856]: E0320 14:30:23.820645 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:30:35 crc kubenswrapper[4856]: I0320 14:30:35.003187 4856 scope.go:117] "RemoveContainer" containerID="828b19ce9293f83dee927f31c83f084f54b2d611b4e4aeaa9bfef9388b8895b7" Mar 20 14:30:35 crc kubenswrapper[4856]: I0320 14:30:35.053123 4856 scope.go:117] "RemoveContainer" containerID="4b97f3e939ad017e8453b293bc48ae202eb064df16e3bf1b2a34e44913f6d7c5" Mar 20 14:30:35 crc kubenswrapper[4856]: I0320 14:30:35.824897 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:30:35 crc kubenswrapper[4856]: E0320 14:30:35.825362 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:30:48 crc kubenswrapper[4856]: I0320 14:30:48.820503 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:30:48 crc kubenswrapper[4856]: E0320 14:30:48.821575 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:30:59 crc 
kubenswrapper[4856]: I0320 14:30:59.819677 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:30:59 crc kubenswrapper[4856]: E0320 14:30:59.820468 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:31:14 crc kubenswrapper[4856]: I0320 14:31:14.819653 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:31:14 crc kubenswrapper[4856]: E0320 14:31:14.821604 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.993829 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:15 crc kubenswrapper[4856]: E0320 14:31:15.994226 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76260dae-e420-469d-9b2b-97daf5635aa9" containerName="oc" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.994243 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="76260dae-e420-469d-9b2b-97daf5635aa9" containerName="oc" Mar 20 14:31:15 crc kubenswrapper[4856]: E0320 14:31:15.994262 4856 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="57d8289b-7203-469a-aa61-0bb7f87f5e3f" containerName="collect-profiles" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.994289 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d8289b-7203-469a-aa61-0bb7f87f5e3f" containerName="collect-profiles" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.994458 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d8289b-7203-469a-aa61-0bb7f87f5e3f" containerName="collect-profiles" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.994489 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="76260dae-e420-469d-9b2b-97daf5635aa9" containerName="oc" Mar 20 14:31:15 crc kubenswrapper[4856]: I0320 14:31:15.997674 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.006110 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.065243 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.065322 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcnfz\" (UniqueName: \"kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.065378 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.166443 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.166497 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcnfz\" (UniqueName: \"kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.166533 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.167012 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.167104 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.186898 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcnfz\" (UniqueName: \"kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz\") pod \"redhat-operators-lxgcm\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.360745 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:16 crc kubenswrapper[4856]: I0320 14:31:16.804612 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:17 crc kubenswrapper[4856]: I0320 14:31:17.082562 4856 generic.go:334] "Generic (PLEG): container finished" podID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerID="3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594" exitCode=0 Mar 20 14:31:17 crc kubenswrapper[4856]: I0320 14:31:17.082832 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerDied","Data":"3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594"} Mar 20 14:31:17 crc kubenswrapper[4856]: I0320 14:31:17.082979 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerStarted","Data":"01f2b216ec4c51e209b2258abcf3cee66f1c3ae9cf44ee4f8c91536c93e77657"} Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.093209 4856 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerStarted","Data":"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677"} Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.390231 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.391621 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.401085 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9fpj\" (UniqueName: \"kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.401159 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.401217 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.417156 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.502576 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9fpj\" (UniqueName: \"kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.502638 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.502703 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.503392 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.503426 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " 
pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.523776 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9fpj\" (UniqueName: \"kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj\") pod \"certified-operators-t7d8p\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.715490 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:18 crc kubenswrapper[4856]: I0320 14:31:18.966842 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:19 crc kubenswrapper[4856]: I0320 14:31:19.100923 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerStarted","Data":"edb749def515698c213a8b053361d61b43a35193ac68b8088556ef53ad9c9753"} Mar 20 14:31:19 crc kubenswrapper[4856]: I0320 14:31:19.103309 4856 generic.go:334] "Generic (PLEG): container finished" podID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerID="0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677" exitCode=0 Mar 20 14:31:19 crc kubenswrapper[4856]: I0320 14:31:19.103361 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerDied","Data":"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677"} Mar 20 14:31:20 crc kubenswrapper[4856]: I0320 14:31:20.112175 4856 generic.go:334] "Generic (PLEG): container finished" podID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerID="d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707" exitCode=0 Mar 20 14:31:20 crc 
kubenswrapper[4856]: I0320 14:31:20.112243 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerDied","Data":"d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707"} Mar 20 14:31:20 crc kubenswrapper[4856]: I0320 14:31:20.117970 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerStarted","Data":"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1"} Mar 20 14:31:20 crc kubenswrapper[4856]: I0320 14:31:20.157727 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lxgcm" podStartSLOduration=2.688525233 podStartE2EDuration="5.157701149s" podCreationTimestamp="2026-03-20 14:31:15 +0000 UTC" firstStartedPulling="2026-03-20 14:31:17.084045351 +0000 UTC m=+4091.965071481" lastFinishedPulling="2026-03-20 14:31:19.553221227 +0000 UTC m=+4094.434247397" observedRunningTime="2026-03-20 14:31:20.151700715 +0000 UTC m=+4095.032726855" watchObservedRunningTime="2026-03-20 14:31:20.157701149 +0000 UTC m=+4095.038727279" Mar 20 14:31:21 crc kubenswrapper[4856]: I0320 14:31:21.126827 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerStarted","Data":"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e"} Mar 20 14:31:22 crc kubenswrapper[4856]: I0320 14:31:22.137488 4856 generic.go:334] "Generic (PLEG): container finished" podID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerID="50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e" exitCode=0 Mar 20 14:31:22 crc kubenswrapper[4856]: I0320 14:31:22.137548 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerDied","Data":"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e"} Mar 20 14:31:23 crc kubenswrapper[4856]: I0320 14:31:23.145837 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerStarted","Data":"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea"} Mar 20 14:31:23 crc kubenswrapper[4856]: I0320 14:31:23.165237 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t7d8p" podStartSLOduration=2.612211022 podStartE2EDuration="5.165219813s" podCreationTimestamp="2026-03-20 14:31:18 +0000 UTC" firstStartedPulling="2026-03-20 14:31:20.114065889 +0000 UTC m=+4094.995092019" lastFinishedPulling="2026-03-20 14:31:22.66707469 +0000 UTC m=+4097.548100810" observedRunningTime="2026-03-20 14:31:23.160726391 +0000 UTC m=+4098.041752541" watchObservedRunningTime="2026-03-20 14:31:23.165219813 +0000 UTC m=+4098.046245953" Mar 20 14:31:26 crc kubenswrapper[4856]: I0320 14:31:26.361251 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:26 crc kubenswrapper[4856]: I0320 14:31:26.361360 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:26 crc kubenswrapper[4856]: I0320 14:31:26.419579 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:26 crc kubenswrapper[4856]: I0320 14:31:26.821290 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:31:26 crc kubenswrapper[4856]: E0320 14:31:26.821965 4856 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:31:27 crc kubenswrapper[4856]: I0320 14:31:27.226805 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:28 crc kubenswrapper[4856]: I0320 14:31:28.378362 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:28 crc kubenswrapper[4856]: I0320 14:31:28.716630 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:28 crc kubenswrapper[4856]: I0320 14:31:28.716687 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:28 crc kubenswrapper[4856]: I0320 14:31:28.767162 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:29 crc kubenswrapper[4856]: I0320 14:31:29.203204 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lxgcm" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="registry-server" containerID="cri-o://3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1" gracePeriod=2 Mar 20 14:31:29 crc kubenswrapper[4856]: I0320 14:31:29.248363 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:30 crc kubenswrapper[4856]: I0320 14:31:30.885882 4856 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.084881 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcnfz\" (UniqueName: \"kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz\") pod \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.085380 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities\") pod \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.085455 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content\") pod \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\" (UID: \"b4eedb91-bafc-4f5d-ad27-e8c93d963436\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.086402 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities" (OuterVolumeSpecName: "utilities") pod "b4eedb91-bafc-4f5d-ad27-e8c93d963436" (UID: "b4eedb91-bafc-4f5d-ad27-e8c93d963436"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.093137 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz" (OuterVolumeSpecName: "kube-api-access-fcnfz") pod "b4eedb91-bafc-4f5d-ad27-e8c93d963436" (UID: "b4eedb91-bafc-4f5d-ad27-e8c93d963436"). InnerVolumeSpecName "kube-api-access-fcnfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.181222 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.187008 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.187055 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcnfz\" (UniqueName: \"kubernetes.io/projected/b4eedb91-bafc-4f5d-ad27-e8c93d963436-kube-api-access-fcnfz\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.218440 4856 generic.go:334] "Generic (PLEG): container finished" podID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerID="3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1" exitCode=0 Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.218682 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t7d8p" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="registry-server" containerID="cri-o://eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea" gracePeriod=2 Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.218805 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lxgcm" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.219108 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerDied","Data":"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1"} Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.219147 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lxgcm" event={"ID":"b4eedb91-bafc-4f5d-ad27-e8c93d963436","Type":"ContainerDied","Data":"01f2b216ec4c51e209b2258abcf3cee66f1c3ae9cf44ee4f8c91536c93e77657"} Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.219168 4856 scope.go:117] "RemoveContainer" containerID="3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.231787 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b4eedb91-bafc-4f5d-ad27-e8c93d963436" (UID: "b4eedb91-bafc-4f5d-ad27-e8c93d963436"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.245390 4856 scope.go:117] "RemoveContainer" containerID="0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.268303 4856 scope.go:117] "RemoveContainer" containerID="3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.288285 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4eedb91-bafc-4f5d-ad27-e8c93d963436-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.304995 4856 scope.go:117] "RemoveContainer" containerID="3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1" Mar 20 14:31:31 crc kubenswrapper[4856]: E0320 14:31:31.306053 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1\": container with ID starting with 3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1 not found: ID does not exist" containerID="3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.306114 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1"} err="failed to get container status \"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1\": rpc error: code = NotFound desc = could not find container \"3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1\": container with ID starting with 3890779148af53fff705a16e206f518e80b18a393d56aeec80a5a666ed3803e1 not found: ID does not exist" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.306155 4856 
scope.go:117] "RemoveContainer" containerID="0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677" Mar 20 14:31:31 crc kubenswrapper[4856]: E0320 14:31:31.306586 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677\": container with ID starting with 0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677 not found: ID does not exist" containerID="0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.306619 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677"} err="failed to get container status \"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677\": rpc error: code = NotFound desc = could not find container \"0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677\": container with ID starting with 0f8457d261cd7794f4a49756b07b9511c32226d2132db681b5887635742b3677 not found: ID does not exist" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.306642 4856 scope.go:117] "RemoveContainer" containerID="3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594" Mar 20 14:31:31 crc kubenswrapper[4856]: E0320 14:31:31.307248 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594\": container with ID starting with 3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594 not found: ID does not exist" containerID="3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.307315 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594"} err="failed to get container status \"3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594\": rpc error: code = NotFound desc = could not find container \"3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594\": container with ID starting with 3419bb967d2abf2845884e4127b61f75d63290965b3585be9738e1efe5ecf594 not found: ID does not exist" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.556760 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.561384 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lxgcm"] Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.567737 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.693029 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content\") pod \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.693121 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9fpj\" (UniqueName: \"kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj\") pod \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.693243 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities\") pod 
\"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\" (UID: \"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846\") " Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.694297 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities" (OuterVolumeSpecName: "utilities") pod "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" (UID: "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.698697 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj" (OuterVolumeSpecName: "kube-api-access-m9fpj") pod "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" (UID: "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846"). InnerVolumeSpecName "kube-api-access-m9fpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.794513 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9fpj\" (UniqueName: \"kubernetes.io/projected/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-kube-api-access-m9fpj\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.794555 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:31 crc kubenswrapper[4856]: I0320 14:31:31.831007 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" path="/var/lib/kubelet/pods/b4eedb91-bafc-4f5d-ad27-e8c93d963436/volumes" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.090914 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" (UID: "f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.099337 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.229326 4856 generic.go:334] "Generic (PLEG): container finished" podID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerID="eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea" exitCode=0 Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.229390 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t7d8p" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.229394 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerDied","Data":"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea"} Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.229468 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t7d8p" event={"ID":"f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846","Type":"ContainerDied","Data":"edb749def515698c213a8b053361d61b43a35193ac68b8088556ef53ad9c9753"} Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.229492 4856 scope.go:117] "RemoveContainer" containerID="eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.248038 4856 scope.go:117] "RemoveContainer" containerID="50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 
14:31:32.260247 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.264977 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t7d8p"] Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.292684 4856 scope.go:117] "RemoveContainer" containerID="d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.311446 4856 scope.go:117] "RemoveContainer" containerID="eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea" Mar 20 14:31:32 crc kubenswrapper[4856]: E0320 14:31:32.312052 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea\": container with ID starting with eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea not found: ID does not exist" containerID="eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.312101 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea"} err="failed to get container status \"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea\": rpc error: code = NotFound desc = could not find container \"eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea\": container with ID starting with eb27779e67066606719465a66ee4461a27be817a33e298bc2f0b57a41ff8deea not found: ID does not exist" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.312136 4856 scope.go:117] "RemoveContainer" containerID="50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e" Mar 20 14:31:32 crc kubenswrapper[4856]: E0320 14:31:32.312603 4856 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e\": container with ID starting with 50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e not found: ID does not exist" containerID="50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.312652 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e"} err="failed to get container status \"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e\": rpc error: code = NotFound desc = could not find container \"50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e\": container with ID starting with 50c96087896abe930f591a90418482ece71c5c5622bf979d6607e13e711db95e not found: ID does not exist" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.312685 4856 scope.go:117] "RemoveContainer" containerID="d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707" Mar 20 14:31:32 crc kubenswrapper[4856]: E0320 14:31:32.313057 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707\": container with ID starting with d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707 not found: ID does not exist" containerID="d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707" Mar 20 14:31:32 crc kubenswrapper[4856]: I0320 14:31:32.313082 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707"} err="failed to get container status \"d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707\": rpc error: code = NotFound desc = could not find container 
\"d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707\": container with ID starting with d3388ccf25c73b728188cbacd63b6dca00ba338d88d2926967cd8d4b9ca12707 not found: ID does not exist" Mar 20 14:31:33 crc kubenswrapper[4856]: I0320 14:31:33.835594 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" path="/var/lib/kubelet/pods/f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846/volumes" Mar 20 14:31:40 crc kubenswrapper[4856]: I0320 14:31:40.820060 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:31:40 crc kubenswrapper[4856]: E0320 14:31:40.820559 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:31:55 crc kubenswrapper[4856]: I0320 14:31:55.827410 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:31:55 crc kubenswrapper[4856]: E0320 14:31:55.828374 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.147452 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566952-hqb8s"] Mar 20 14:32:00 crc 
kubenswrapper[4856]: E0320 14:32:00.148160 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148176 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4856]: E0320 14:32:00.148197 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148207 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4856]: E0320 14:32:00.148223 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148231 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: E0320 14:32:00.148264 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148293 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: E0320 14:32:00.148310 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148320 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="extract-utilities" Mar 20 14:32:00 crc 
kubenswrapper[4856]: E0320 14:32:00.148343 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148354 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148525 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a5ccdf-11e4-4c2d-a0c8-0ece876ad846" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.148545 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4eedb91-bafc-4f5d-ad27-e8c93d963436" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.149158 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.153664 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.153884 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.154058 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.157018 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-hqb8s"] Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.211525 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tg2r\" (UniqueName: \"kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r\") pod 
\"auto-csr-approver-29566952-hqb8s\" (UID: \"b473a485-f50e-493b-9e17-5cae0eb5c389\") " pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.312758 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tg2r\" (UniqueName: \"kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r\") pod \"auto-csr-approver-29566952-hqb8s\" (UID: \"b473a485-f50e-493b-9e17-5cae0eb5c389\") " pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.333424 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tg2r\" (UniqueName: \"kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r\") pod \"auto-csr-approver-29566952-hqb8s\" (UID: \"b473a485-f50e-493b-9e17-5cae0eb5c389\") " pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.470942 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:00 crc kubenswrapper[4856]: I0320 14:32:00.902108 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-hqb8s"] Mar 20 14:32:01 crc kubenswrapper[4856]: I0320 14:32:01.624957 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" event={"ID":"b473a485-f50e-493b-9e17-5cae0eb5c389","Type":"ContainerStarted","Data":"50fe20c8a199d6f2fadd23e2afa35d0645501fff02aca47b79e8fa393208d375"} Mar 20 14:32:02 crc kubenswrapper[4856]: I0320 14:32:02.633154 4856 generic.go:334] "Generic (PLEG): container finished" podID="b473a485-f50e-493b-9e17-5cae0eb5c389" containerID="71d2b03f9a7471983355cfb8afa218b1cdb7a4c67d9d390faf8ed9c9fd0d1b3c" exitCode=0 Mar 20 14:32:02 crc kubenswrapper[4856]: I0320 14:32:02.633280 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" event={"ID":"b473a485-f50e-493b-9e17-5cae0eb5c389","Type":"ContainerDied","Data":"71d2b03f9a7471983355cfb8afa218b1cdb7a4c67d9d390faf8ed9c9fd0d1b3c"} Mar 20 14:32:03 crc kubenswrapper[4856]: I0320 14:32:03.944615 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:03 crc kubenswrapper[4856]: I0320 14:32:03.965687 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tg2r\" (UniqueName: \"kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r\") pod \"b473a485-f50e-493b-9e17-5cae0eb5c389\" (UID: \"b473a485-f50e-493b-9e17-5cae0eb5c389\") " Mar 20 14:32:03 crc kubenswrapper[4856]: I0320 14:32:03.971713 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r" (OuterVolumeSpecName: "kube-api-access-2tg2r") pod "b473a485-f50e-493b-9e17-5cae0eb5c389" (UID: "b473a485-f50e-493b-9e17-5cae0eb5c389"). InnerVolumeSpecName "kube-api-access-2tg2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:32:04 crc kubenswrapper[4856]: I0320 14:32:04.067686 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tg2r\" (UniqueName: \"kubernetes.io/projected/b473a485-f50e-493b-9e17-5cae0eb5c389-kube-api-access-2tg2r\") on node \"crc\" DevicePath \"\"" Mar 20 14:32:04 crc kubenswrapper[4856]: I0320 14:32:04.651023 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" event={"ID":"b473a485-f50e-493b-9e17-5cae0eb5c389","Type":"ContainerDied","Data":"50fe20c8a199d6f2fadd23e2afa35d0645501fff02aca47b79e8fa393208d375"} Mar 20 14:32:04 crc kubenswrapper[4856]: I0320 14:32:04.651059 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50fe20c8a199d6f2fadd23e2afa35d0645501fff02aca47b79e8fa393208d375" Mar 20 14:32:04 crc kubenswrapper[4856]: I0320 14:32:04.651122 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-hqb8s" Mar 20 14:32:05 crc kubenswrapper[4856]: I0320 14:32:05.013758 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-68bcb"] Mar 20 14:32:05 crc kubenswrapper[4856]: I0320 14:32:05.021100 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-68bcb"] Mar 20 14:32:05 crc kubenswrapper[4856]: I0320 14:32:05.832106 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa602330-72bc-4a6a-9d88-74a383941fd3" path="/var/lib/kubelet/pods/fa602330-72bc-4a6a-9d88-74a383941fd3/volumes" Mar 20 14:32:10 crc kubenswrapper[4856]: I0320 14:32:10.820255 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:32:10 crc kubenswrapper[4856]: E0320 14:32:10.821080 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:32:24 crc kubenswrapper[4856]: I0320 14:32:24.820617 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:32:24 crc kubenswrapper[4856]: E0320 14:32:24.821452 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:32:35 crc kubenswrapper[4856]: I0320 14:32:35.149963 4856 scope.go:117] "RemoveContainer" containerID="fe5b79b849a5f23e8168332d6c7562860e647026d1346645443f369b1061b0a4" Mar 20 14:32:36 crc kubenswrapper[4856]: I0320 14:32:36.820616 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:32:36 crc kubenswrapper[4856]: E0320 14:32:36.820975 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:32:48 crc kubenswrapper[4856]: I0320 14:32:48.820233 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:32:48 crc kubenswrapper[4856]: E0320 14:32:48.821872 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:33:02 crc kubenswrapper[4856]: I0320 14:33:02.819521 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:33:02 crc kubenswrapper[4856]: E0320 14:33:02.820187 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:33:13 crc kubenswrapper[4856]: I0320 14:33:13.819823 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:33:13 crc kubenswrapper[4856]: E0320 14:33:13.820795 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:33:27 crc kubenswrapper[4856]: I0320 14:33:27.821443 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:33:27 crc kubenswrapper[4856]: E0320 14:33:27.822063 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:33:41 crc kubenswrapper[4856]: I0320 14:33:41.820043 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:33:42 crc kubenswrapper[4856]: I0320 14:33:42.815228 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d"} Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.135736 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566954-zhgqf"] Mar 20 14:34:00 crc kubenswrapper[4856]: E0320 14:34:00.136771 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b473a485-f50e-493b-9e17-5cae0eb5c389" containerName="oc" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.136789 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b473a485-f50e-493b-9e17-5cae0eb5c389" containerName="oc" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.136960 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b473a485-f50e-493b-9e17-5cae0eb5c389" containerName="oc" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.138050 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.140603 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.140696 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.142912 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-zhgqf"] Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.143505 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.255123 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d625j\" (UniqueName: 
\"kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j\") pod \"auto-csr-approver-29566954-zhgqf\" (UID: \"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43\") " pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.356331 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d625j\" (UniqueName: \"kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j\") pod \"auto-csr-approver-29566954-zhgqf\" (UID: \"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43\") " pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.378500 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d625j\" (UniqueName: \"kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j\") pod \"auto-csr-approver-29566954-zhgqf\" (UID: \"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43\") " pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.463040 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.905465 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-zhgqf"] Mar 20 14:34:00 crc kubenswrapper[4856]: I0320 14:34:00.942103 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" event={"ID":"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43","Type":"ContainerStarted","Data":"83e3d9f2ca39fc1cf1e5a1ae11d91c253e40acb0099f88862bf06391a64c2d44"} Mar 20 14:34:02 crc kubenswrapper[4856]: I0320 14:34:02.958212 4856 generic.go:334] "Generic (PLEG): container finished" podID="5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" containerID="49822c57c00fbb79a45bfac30ad0ac460af29d37a1242b9d0581605713a4434e" exitCode=0 Mar 20 14:34:02 crc kubenswrapper[4856]: I0320 14:34:02.958542 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" event={"ID":"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43","Type":"ContainerDied","Data":"49822c57c00fbb79a45bfac30ad0ac460af29d37a1242b9d0581605713a4434e"} Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.253194 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.415146 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d625j\" (UniqueName: \"kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j\") pod \"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43\" (UID: \"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43\") " Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.422827 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j" (OuterVolumeSpecName: "kube-api-access-d625j") pod "5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" (UID: "5c6bef62-76c2-4908-a28e-7c2a2d9f9c43"). InnerVolumeSpecName "kube-api-access-d625j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.516639 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d625j\" (UniqueName: \"kubernetes.io/projected/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43-kube-api-access-d625j\") on node \"crc\" DevicePath \"\"" Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.977421 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" event={"ID":"5c6bef62-76c2-4908-a28e-7c2a2d9f9c43","Type":"ContainerDied","Data":"83e3d9f2ca39fc1cf1e5a1ae11d91c253e40acb0099f88862bf06391a64c2d44"} Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.977484 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e3d9f2ca39fc1cf1e5a1ae11d91c253e40acb0099f88862bf06391a64c2d44" Mar 20 14:34:04 crc kubenswrapper[4856]: I0320 14:34:04.977539 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-zhgqf" Mar 20 14:34:05 crc kubenswrapper[4856]: I0320 14:34:05.325905 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-wkcpr"] Mar 20 14:34:05 crc kubenswrapper[4856]: I0320 14:34:05.331128 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-wkcpr"] Mar 20 14:34:05 crc kubenswrapper[4856]: I0320 14:34:05.842659 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f81b3d8-e3f8-4519-b669-d20a7371ac14" path="/var/lib/kubelet/pods/3f81b3d8-e3f8-4519-b669-d20a7371ac14/volumes" Mar 20 14:34:35 crc kubenswrapper[4856]: I0320 14:34:35.232592 4856 scope.go:117] "RemoveContainer" containerID="9f50da79dcc154686d49c67a68f6846b029264d59e3aec2d30e8c084252ea40c" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.673923 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:04 crc kubenswrapper[4856]: E0320 14:35:04.674842 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" containerName="oc" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.674937 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" containerName="oc" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.675120 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" containerName="oc" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.676083 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.690917 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.823188 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.823341 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.823520 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cts49\" (UniqueName: \"kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.924378 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.924529 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.924581 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cts49\" (UniqueName: \"kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.925087 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.926423 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.951261 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cts49\" (UniqueName: \"kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49\") pod \"redhat-marketplace-27b4z\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:04 crc kubenswrapper[4856]: I0320 14:35:04.993883 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:05 crc kubenswrapper[4856]: I0320 14:35:05.451887 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:06 crc kubenswrapper[4856]: I0320 14:35:06.434138 4856 generic.go:334] "Generic (PLEG): container finished" podID="352918f2-beb9-48f7-be20-e0578ac448cf" containerID="65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc" exitCode=0 Mar 20 14:35:06 crc kubenswrapper[4856]: I0320 14:35:06.434346 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerDied","Data":"65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc"} Mar 20 14:35:06 crc kubenswrapper[4856]: I0320 14:35:06.434543 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerStarted","Data":"733e69fd237c51e9831eb8a5c9661786320a1743a1b67308c0ae662cfe759f8d"} Mar 20 14:35:06 crc kubenswrapper[4856]: I0320 14:35:06.438836 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:35:07 crc kubenswrapper[4856]: I0320 14:35:07.443602 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerStarted","Data":"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5"} Mar 20 14:35:08 crc kubenswrapper[4856]: I0320 14:35:08.455355 4856 generic.go:334] "Generic (PLEG): container finished" podID="352918f2-beb9-48f7-be20-e0578ac448cf" containerID="c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5" exitCode=0 Mar 20 14:35:08 crc kubenswrapper[4856]: I0320 14:35:08.455434 4856 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerDied","Data":"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5"} Mar 20 14:35:09 crc kubenswrapper[4856]: I0320 14:35:09.465481 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerStarted","Data":"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a"} Mar 20 14:35:09 crc kubenswrapper[4856]: I0320 14:35:09.491489 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-27b4z" podStartSLOduration=2.995601589 podStartE2EDuration="5.491472342s" podCreationTimestamp="2026-03-20 14:35:04 +0000 UTC" firstStartedPulling="2026-03-20 14:35:06.438409646 +0000 UTC m=+4321.319435786" lastFinishedPulling="2026-03-20 14:35:08.934280409 +0000 UTC m=+4323.815306539" observedRunningTime="2026-03-20 14:35:09.487299478 +0000 UTC m=+4324.368325618" watchObservedRunningTime="2026-03-20 14:35:09.491472342 +0000 UTC m=+4324.372498472" Mar 20 14:35:14 crc kubenswrapper[4856]: I0320 14:35:14.994375 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:14 crc kubenswrapper[4856]: I0320 14:35:14.995193 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:15 crc kubenswrapper[4856]: I0320 14:35:15.038247 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:15 crc kubenswrapper[4856]: I0320 14:35:15.562977 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:15 crc kubenswrapper[4856]: I0320 14:35:15.612729 4856 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.520700 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-27b4z" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="registry-server" containerID="cri-o://d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a" gracePeriod=2 Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.917748 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.931008 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities\") pod \"352918f2-beb9-48f7-be20-e0578ac448cf\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.931116 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content\") pod \"352918f2-beb9-48f7-be20-e0578ac448cf\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.931210 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cts49\" (UniqueName: \"kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49\") pod \"352918f2-beb9-48f7-be20-e0578ac448cf\" (UID: \"352918f2-beb9-48f7-be20-e0578ac448cf\") " Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.932531 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities" (OuterVolumeSpecName: "utilities") pod 
"352918f2-beb9-48f7-be20-e0578ac448cf" (UID: "352918f2-beb9-48f7-be20-e0578ac448cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:35:17 crc kubenswrapper[4856]: I0320 14:35:17.941285 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49" (OuterVolumeSpecName: "kube-api-access-cts49") pod "352918f2-beb9-48f7-be20-e0578ac448cf" (UID: "352918f2-beb9-48f7-be20-e0578ac448cf"). InnerVolumeSpecName "kube-api-access-cts49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.033524 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cts49\" (UniqueName: \"kubernetes.io/projected/352918f2-beb9-48f7-be20-e0578ac448cf-kube-api-access-cts49\") on node \"crc\" DevicePath \"\"" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.033567 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.127242 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "352918f2-beb9-48f7-be20-e0578ac448cf" (UID: "352918f2-beb9-48f7-be20-e0578ac448cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.135580 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/352918f2-beb9-48f7-be20-e0578ac448cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.532746 4856 generic.go:334] "Generic (PLEG): container finished" podID="352918f2-beb9-48f7-be20-e0578ac448cf" containerID="d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a" exitCode=0 Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.532809 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerDied","Data":"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a"} Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.532848 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-27b4z" event={"ID":"352918f2-beb9-48f7-be20-e0578ac448cf","Type":"ContainerDied","Data":"733e69fd237c51e9831eb8a5c9661786320a1743a1b67308c0ae662cfe759f8d"} Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.532843 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-27b4z" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.532870 4856 scope.go:117] "RemoveContainer" containerID="d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.555127 4856 scope.go:117] "RemoveContainer" containerID="c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.579405 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.586042 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-27b4z"] Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.590179 4856 scope.go:117] "RemoveContainer" containerID="65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.627132 4856 scope.go:117] "RemoveContainer" containerID="d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a" Mar 20 14:35:18 crc kubenswrapper[4856]: E0320 14:35:18.627705 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a\": container with ID starting with d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a not found: ID does not exist" containerID="d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.627750 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a"} err="failed to get container status \"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a\": rpc error: code = NotFound desc = could not find container 
\"d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a\": container with ID starting with d70fea53c9d438c67b0f0b39c9dfc88abd9ebfb356aeb786358f5af62947f91a not found: ID does not exist" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.627781 4856 scope.go:117] "RemoveContainer" containerID="c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5" Mar 20 14:35:18 crc kubenswrapper[4856]: E0320 14:35:18.628240 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5\": container with ID starting with c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5 not found: ID does not exist" containerID="c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.628314 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5"} err="failed to get container status \"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5\": rpc error: code = NotFound desc = could not find container \"c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5\": container with ID starting with c5a474426a603bf660518e54ac9c743d1a65a52ca45d3f11c4b29ab46e805ed5 not found: ID does not exist" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.628349 4856 scope.go:117] "RemoveContainer" containerID="65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc" Mar 20 14:35:18 crc kubenswrapper[4856]: E0320 14:35:18.628711 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc\": container with ID starting with 65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc not found: ID does not exist" 
containerID="65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc" Mar 20 14:35:18 crc kubenswrapper[4856]: I0320 14:35:18.628758 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc"} err="failed to get container status \"65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc\": rpc error: code = NotFound desc = could not find container \"65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc\": container with ID starting with 65c1113fc10cd2182f17ac984006b0acd65cb838fe9fe79fcd2079bee2d4d9dc not found: ID does not exist" Mar 20 14:35:19 crc kubenswrapper[4856]: I0320 14:35:19.827823 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" path="/var/lib/kubelet/pods/352918f2-beb9-48f7-be20-e0578ac448cf/volumes" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.148745 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566956-zm4tl"] Mar 20 14:36:00 crc kubenswrapper[4856]: E0320 14:36:00.149752 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="registry-server" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.149770 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="registry-server" Mar 20 14:36:00 crc kubenswrapper[4856]: E0320 14:36:00.149793 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="extract-content" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.149798 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="extract-content" Mar 20 14:36:00 crc kubenswrapper[4856]: E0320 14:36:00.149813 4856 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="extract-utilities" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.149821 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="extract-utilities" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.149989 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="352918f2-beb9-48f7-be20-e0578ac448cf" containerName="registry-server" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.150601 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.159233 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.159493 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.161583 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.161721 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5zcb\" (UniqueName: \"kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb\") pod \"auto-csr-approver-29566956-zm4tl\" (UID: \"f53ea005-eae6-4050-99c6-968812c0d0e2\") " pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.166757 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-zm4tl"] Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.263774 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-v5zcb\" (UniqueName: \"kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb\") pod \"auto-csr-approver-29566956-zm4tl\" (UID: \"f53ea005-eae6-4050-99c6-968812c0d0e2\") " pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.286721 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5zcb\" (UniqueName: \"kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb\") pod \"auto-csr-approver-29566956-zm4tl\" (UID: \"f53ea005-eae6-4050-99c6-968812c0d0e2\") " pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.483866 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:00 crc kubenswrapper[4856]: I0320 14:36:00.901207 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-zm4tl"] Mar 20 14:36:01 crc kubenswrapper[4856]: I0320 14:36:01.897436 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" event={"ID":"f53ea005-eae6-4050-99c6-968812c0d0e2","Type":"ContainerStarted","Data":"4a95fdeebff6a3ea5073cf1ecb47fcb922e3107ec777ba947551878651dbc66c"} Mar 20 14:36:02 crc kubenswrapper[4856]: I0320 14:36:02.906906 4856 generic.go:334] "Generic (PLEG): container finished" podID="f53ea005-eae6-4050-99c6-968812c0d0e2" containerID="741a54ac4f2b396237eb6e8a08de9cdd1664ea21d3cb18af9b7740cba06c85fa" exitCode=0 Mar 20 14:36:02 crc kubenswrapper[4856]: I0320 14:36:02.906951 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" event={"ID":"f53ea005-eae6-4050-99c6-968812c0d0e2","Type":"ContainerDied","Data":"741a54ac4f2b396237eb6e8a08de9cdd1664ea21d3cb18af9b7740cba06c85fa"} Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 
14:36:04.194698 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.228491 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5zcb\" (UniqueName: \"kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb\") pod \"f53ea005-eae6-4050-99c6-968812c0d0e2\" (UID: \"f53ea005-eae6-4050-99c6-968812c0d0e2\") " Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.236341 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb" (OuterVolumeSpecName: "kube-api-access-v5zcb") pod "f53ea005-eae6-4050-99c6-968812c0d0e2" (UID: "f53ea005-eae6-4050-99c6-968812c0d0e2"). InnerVolumeSpecName "kube-api-access-v5zcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.330202 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5zcb\" (UniqueName: \"kubernetes.io/projected/f53ea005-eae6-4050-99c6-968812c0d0e2-kube-api-access-v5zcb\") on node \"crc\" DevicePath \"\"" Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.927819 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" event={"ID":"f53ea005-eae6-4050-99c6-968812c0d0e2","Type":"ContainerDied","Data":"4a95fdeebff6a3ea5073cf1ecb47fcb922e3107ec777ba947551878651dbc66c"} Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.927863 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a95fdeebff6a3ea5073cf1ecb47fcb922e3107ec777ba947551878651dbc66c" Mar 20 14:36:04 crc kubenswrapper[4856]: I0320 14:36:04.927880 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-zm4tl" Mar 20 14:36:05 crc kubenswrapper[4856]: I0320 14:36:05.254631 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-wsqbw"] Mar 20 14:36:05 crc kubenswrapper[4856]: I0320 14:36:05.259578 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-wsqbw"] Mar 20 14:36:05 crc kubenswrapper[4856]: I0320 14:36:05.834138 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76260dae-e420-469d-9b2b-97daf5635aa9" path="/var/lib/kubelet/pods/76260dae-e420-469d-9b2b-97daf5635aa9/volumes" Mar 20 14:36:09 crc kubenswrapper[4856]: I0320 14:36:09.987167 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:36:09 crc kubenswrapper[4856]: I0320 14:36:09.987600 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:36:35 crc kubenswrapper[4856]: I0320 14:36:35.327530 4856 scope.go:117] "RemoveContainer" containerID="81606a0f52dab6ebb13e05eab0d532a097d70a81fff24577e555e0fe2ae1b2a9" Mar 20 14:36:39 crc kubenswrapper[4856]: I0320 14:36:39.987454 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:36:39 crc kubenswrapper[4856]: 
I0320 14:36:39.988005 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:37:09 crc kubenswrapper[4856]: I0320 14:37:09.987920 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:37:09 crc kubenswrapper[4856]: I0320 14:37:09.988850 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:37:09 crc kubenswrapper[4856]: I0320 14:37:09.988939 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:37:09 crc kubenswrapper[4856]: I0320 14:37:09.989911 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:37:09 crc kubenswrapper[4856]: I0320 14:37:09.989993 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" 
containerName="machine-config-daemon" containerID="cri-o://1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d" gracePeriod=600 Mar 20 14:37:10 crc kubenswrapper[4856]: I0320 14:37:10.476837 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d" exitCode=0 Mar 20 14:37:10 crc kubenswrapper[4856]: I0320 14:37:10.476874 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d"} Mar 20 14:37:10 crc kubenswrapper[4856]: I0320 14:37:10.477158 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf"} Mar 20 14:37:10 crc kubenswrapper[4856]: I0320 14:37:10.477182 4856 scope.go:117] "RemoveContainer" containerID="ce9587d698cc24f64f47e9e0d34390a675570f509282fbd850838fca493b3aa1" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.328094 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:25 crc kubenswrapper[4856]: E0320 14:37:25.332602 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53ea005-eae6-4050-99c6-968812c0d0e2" containerName="oc" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.332735 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53ea005-eae6-4050-99c6-968812c0d0e2" containerName="oc" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.333064 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53ea005-eae6-4050-99c6-968812c0d0e2" containerName="oc" Mar 20 14:37:25 crc 
kubenswrapper[4856]: I0320 14:37:25.334812 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.344973 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.454808 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vjt2\" (UniqueName: \"kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.454932 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.454963 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.555892 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vjt2\" (UniqueName: \"kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " 
pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.555990 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.556020 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.556565 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.556699 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.575592 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vjt2\" (UniqueName: \"kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2\") pod \"community-operators-wq4jw\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " 
pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:25 crc kubenswrapper[4856]: I0320 14:37:25.657183 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:26 crc kubenswrapper[4856]: I0320 14:37:26.515705 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:26 crc kubenswrapper[4856]: I0320 14:37:26.607808 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerStarted","Data":"9d6e52fb74a233d74890ee91e3cc8c70db120a9879ef09f9be4fdf8cac0e0538"} Mar 20 14:37:27 crc kubenswrapper[4856]: I0320 14:37:27.616227 4856 generic.go:334] "Generic (PLEG): container finished" podID="6ce7f4ec-6231-4747-9335-323de02f423e" containerID="8c402ecd87b0f0e3700678aa682bf077d6c20e8af284e8acade93eaf2d6eb468" exitCode=0 Mar 20 14:37:27 crc kubenswrapper[4856]: I0320 14:37:27.616298 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerDied","Data":"8c402ecd87b0f0e3700678aa682bf077d6c20e8af284e8acade93eaf2d6eb468"} Mar 20 14:37:28 crc kubenswrapper[4856]: I0320 14:37:28.628895 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerStarted","Data":"c210cb97d0430119c4e6fae8133cc46478a510858ab9ddb7026e9890f63fdefa"} Mar 20 14:37:29 crc kubenswrapper[4856]: I0320 14:37:29.640541 4856 generic.go:334] "Generic (PLEG): container finished" podID="6ce7f4ec-6231-4747-9335-323de02f423e" containerID="c210cb97d0430119c4e6fae8133cc46478a510858ab9ddb7026e9890f63fdefa" exitCode=0 Mar 20 14:37:29 crc kubenswrapper[4856]: I0320 14:37:29.640601 4856 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerDied","Data":"c210cb97d0430119c4e6fae8133cc46478a510858ab9ddb7026e9890f63fdefa"} Mar 20 14:37:30 crc kubenswrapper[4856]: I0320 14:37:30.651574 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerStarted","Data":"4b889530d83c87115579a2115b5630c15e100aa9e71fea435d13cceaa034e2e7"} Mar 20 14:37:30 crc kubenswrapper[4856]: I0320 14:37:30.675619 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wq4jw" podStartSLOduration=3.183776643 podStartE2EDuration="5.675599476s" podCreationTimestamp="2026-03-20 14:37:25 +0000 UTC" firstStartedPulling="2026-03-20 14:37:27.618090379 +0000 UTC m=+4462.499116509" lastFinishedPulling="2026-03-20 14:37:30.109913212 +0000 UTC m=+4464.990939342" observedRunningTime="2026-03-20 14:37:30.670871917 +0000 UTC m=+4465.551898057" watchObservedRunningTime="2026-03-20 14:37:30.675599476 +0000 UTC m=+4465.556625606" Mar 20 14:37:35 crc kubenswrapper[4856]: I0320 14:37:35.658351 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:35 crc kubenswrapper[4856]: I0320 14:37:35.658889 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:35 crc kubenswrapper[4856]: I0320 14:37:35.712753 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:35 crc kubenswrapper[4856]: I0320 14:37:35.770616 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:35 crc kubenswrapper[4856]: I0320 
14:37:35.947997 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:37 crc kubenswrapper[4856]: I0320 14:37:37.714186 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wq4jw" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="registry-server" containerID="cri-o://4b889530d83c87115579a2115b5630c15e100aa9e71fea435d13cceaa034e2e7" gracePeriod=2 Mar 20 14:37:38 crc kubenswrapper[4856]: I0320 14:37:38.723190 4856 generic.go:334] "Generic (PLEG): container finished" podID="6ce7f4ec-6231-4747-9335-323de02f423e" containerID="4b889530d83c87115579a2115b5630c15e100aa9e71fea435d13cceaa034e2e7" exitCode=0 Mar 20 14:37:38 crc kubenswrapper[4856]: I0320 14:37:38.723229 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerDied","Data":"4b889530d83c87115579a2115b5630c15e100aa9e71fea435d13cceaa034e2e7"} Mar 20 14:37:38 crc kubenswrapper[4856]: I0320 14:37:38.723600 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq4jw" event={"ID":"6ce7f4ec-6231-4747-9335-323de02f423e","Type":"ContainerDied","Data":"9d6e52fb74a233d74890ee91e3cc8c70db120a9879ef09f9be4fdf8cac0e0538"} Mar 20 14:37:38 crc kubenswrapper[4856]: I0320 14:37:38.723619 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d6e52fb74a233d74890ee91e3cc8c70db120a9879ef09f9be4fdf8cac0e0538" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.113484 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.308649 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities\") pod \"6ce7f4ec-6231-4747-9335-323de02f423e\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.309090 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content\") pod \"6ce7f4ec-6231-4747-9335-323de02f423e\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.309098 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities" (OuterVolumeSpecName: "utilities") pod "6ce7f4ec-6231-4747-9335-323de02f423e" (UID: "6ce7f4ec-6231-4747-9335-323de02f423e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.309284 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vjt2\" (UniqueName: \"kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2\") pod \"6ce7f4ec-6231-4747-9335-323de02f423e\" (UID: \"6ce7f4ec-6231-4747-9335-323de02f423e\") " Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.309780 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.314470 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2" (OuterVolumeSpecName: "kube-api-access-7vjt2") pod "6ce7f4ec-6231-4747-9335-323de02f423e" (UID: "6ce7f4ec-6231-4747-9335-323de02f423e"). InnerVolumeSpecName "kube-api-access-7vjt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.369542 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ce7f4ec-6231-4747-9335-323de02f423e" (UID: "6ce7f4ec-6231-4747-9335-323de02f423e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.411139 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vjt2\" (UniqueName: \"kubernetes.io/projected/6ce7f4ec-6231-4747-9335-323de02f423e-kube-api-access-7vjt2\") on node \"crc\" DevicePath \"\"" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.411205 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ce7f4ec-6231-4747-9335-323de02f423e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.732298 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq4jw" Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.772060 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.781553 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wq4jw"] Mar 20 14:37:39 crc kubenswrapper[4856]: I0320 14:37:39.828786 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" path="/var/lib/kubelet/pods/6ce7f4ec-6231-4747-9335-323de02f423e/volumes" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.140622 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566958-qq7b2"] Mar 20 14:38:00 crc kubenswrapper[4856]: E0320 14:38:00.141446 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="extract-content" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.141458 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="extract-content" Mar 20 14:38:00 crc 
kubenswrapper[4856]: E0320 14:38:00.141471 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="extract-utilities" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.141477 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="extract-utilities" Mar 20 14:38:00 crc kubenswrapper[4856]: E0320 14:38:00.141487 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="registry-server" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.141495 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="registry-server" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.141628 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ce7f4ec-6231-4747-9335-323de02f423e" containerName="registry-server" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.142096 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.146051 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.146155 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.146167 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.151895 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-qq7b2"] Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.223120 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b56z\" (UniqueName: \"kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z\") pod \"auto-csr-approver-29566958-qq7b2\" (UID: \"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371\") " pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.325617 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b56z\" (UniqueName: \"kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z\") pod \"auto-csr-approver-29566958-qq7b2\" (UID: \"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371\") " pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.541706 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b56z\" (UniqueName: \"kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z\") pod \"auto-csr-approver-29566958-qq7b2\" (UID: \"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371\") " 
pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:00 crc kubenswrapper[4856]: I0320 14:38:00.760383 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:01 crc kubenswrapper[4856]: I0320 14:38:01.166763 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-qq7b2"] Mar 20 14:38:01 crc kubenswrapper[4856]: I0320 14:38:01.902477 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" event={"ID":"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371","Type":"ContainerStarted","Data":"c147c8c7e89a9d7a7e968694796178ac1893f299d868aae823686f3d28e293bf"} Mar 20 14:38:02 crc kubenswrapper[4856]: I0320 14:38:02.912063 4856 generic.go:334] "Generic (PLEG): container finished" podID="21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" containerID="0d9ae4329313a74efcda8912ea6300ba8fcee2e9b7d9c4ab6ed28ad283569e20" exitCode=0 Mar 20 14:38:02 crc kubenswrapper[4856]: I0320 14:38:02.912200 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" event={"ID":"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371","Type":"ContainerDied","Data":"0d9ae4329313a74efcda8912ea6300ba8fcee2e9b7d9c4ab6ed28ad283569e20"} Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.213734 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.379087 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b56z\" (UniqueName: \"kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z\") pod \"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371\" (UID: \"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371\") " Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.385433 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z" (OuterVolumeSpecName: "kube-api-access-8b56z") pod "21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" (UID: "21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371"). InnerVolumeSpecName "kube-api-access-8b56z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.480932 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b56z\" (UniqueName: \"kubernetes.io/projected/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371-kube-api-access-8b56z\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.930072 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" event={"ID":"21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371","Type":"ContainerDied","Data":"c147c8c7e89a9d7a7e968694796178ac1893f299d868aae823686f3d28e293bf"} Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.930111 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c147c8c7e89a9d7a7e968694796178ac1893f299d868aae823686f3d28e293bf" Mar 20 14:38:04 crc kubenswrapper[4856]: I0320 14:38:04.930121 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-qq7b2" Mar 20 14:38:05 crc kubenswrapper[4856]: I0320 14:38:05.280362 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-hqb8s"] Mar 20 14:38:05 crc kubenswrapper[4856]: I0320 14:38:05.289287 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-hqb8s"] Mar 20 14:38:05 crc kubenswrapper[4856]: I0320 14:38:05.829645 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b473a485-f50e-493b-9e17-5cae0eb5c389" path="/var/lib/kubelet/pods/b473a485-f50e-493b-9e17-5cae0eb5c389/volumes" Mar 20 14:38:35 crc kubenswrapper[4856]: I0320 14:38:35.408350 4856 scope.go:117] "RemoveContainer" containerID="71d2b03f9a7471983355cfb8afa218b1cdb7a4c67d9d390faf8ed9c9fd0d1b3c" Mar 20 14:39:39 crc kubenswrapper[4856]: I0320 14:39:39.987175 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:39:39 crc kubenswrapper[4856]: I0320 14:39:39.987768 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.151011 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566960-f7pbd"] Mar 20 14:40:00 crc kubenswrapper[4856]: E0320 14:40:00.152892 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" containerName="oc" Mar 20 14:40:00 crc 
kubenswrapper[4856]: I0320 14:40:00.152945 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" containerName="oc" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.153215 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" containerName="oc" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.154108 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.156898 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.157058 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.157080 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.174586 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-f7pbd"] Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.227242 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjb4\" (UniqueName: \"kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4\") pod \"auto-csr-approver-29566960-f7pbd\" (UID: \"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85\") " pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.328948 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjb4\" (UniqueName: \"kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4\") pod \"auto-csr-approver-29566960-f7pbd\" 
(UID: \"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85\") " pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.356346 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjb4\" (UniqueName: \"kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4\") pod \"auto-csr-approver-29566960-f7pbd\" (UID: \"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85\") " pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.474919 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:00 crc kubenswrapper[4856]: I0320 14:40:00.887664 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-f7pbd"] Mar 20 14:40:01 crc kubenswrapper[4856]: I0320 14:40:01.780542 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" event={"ID":"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85","Type":"ContainerStarted","Data":"b760b16285df9ea58f0d6ec407681744e5868907c6e4f86e9b18a512a2781b7a"} Mar 20 14:40:02 crc kubenswrapper[4856]: I0320 14:40:02.787735 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" event={"ID":"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85","Type":"ContainerStarted","Data":"d2536b3ae04957af475866a559af964e55d63f8b98feb233d0b298276dd4e8ab"} Mar 20 14:40:02 crc kubenswrapper[4856]: I0320 14:40:02.807511 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" podStartSLOduration=1.266788266 podStartE2EDuration="2.807489815s" podCreationTimestamp="2026-03-20 14:40:00 +0000 UTC" firstStartedPulling="2026-03-20 14:40:00.898380941 +0000 UTC m=+4615.779407071" lastFinishedPulling="2026-03-20 14:40:02.43908249 +0000 UTC 
m=+4617.320108620" observedRunningTime="2026-03-20 14:40:02.80396634 +0000 UTC m=+4617.684992500" watchObservedRunningTime="2026-03-20 14:40:02.807489815 +0000 UTC m=+4617.688515965" Mar 20 14:40:03 crc kubenswrapper[4856]: I0320 14:40:03.798991 4856 generic.go:334] "Generic (PLEG): container finished" podID="a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" containerID="d2536b3ae04957af475866a559af964e55d63f8b98feb233d0b298276dd4e8ab" exitCode=0 Mar 20 14:40:03 crc kubenswrapper[4856]: I0320 14:40:03.799050 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" event={"ID":"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85","Type":"ContainerDied","Data":"d2536b3ae04957af475866a559af964e55d63f8b98feb233d0b298276dd4e8ab"} Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.089932 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.196826 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfjb4\" (UniqueName: \"kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4\") pod \"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85\" (UID: \"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85\") " Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.201824 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4" (OuterVolumeSpecName: "kube-api-access-dfjb4") pod "a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" (UID: "a1c9f117-237f-4b0a-9bfc-b6d3f6308a85"). InnerVolumeSpecName "kube-api-access-dfjb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.298596 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfjb4\" (UniqueName: \"kubernetes.io/projected/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85-kube-api-access-dfjb4\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.814745 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" event={"ID":"a1c9f117-237f-4b0a-9bfc-b6d3f6308a85","Type":"ContainerDied","Data":"b760b16285df9ea58f0d6ec407681744e5868907c6e4f86e9b18a512a2781b7a"} Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.814800 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b760b16285df9ea58f0d6ec407681744e5868907c6e4f86e9b18a512a2781b7a" Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.814878 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-f7pbd" Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.908300 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-zhgqf"] Mar 20 14:40:05 crc kubenswrapper[4856]: I0320 14:40:05.915817 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-zhgqf"] Mar 20 14:40:07 crc kubenswrapper[4856]: I0320 14:40:07.849771 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c6bef62-76c2-4908-a28e-7c2a2d9f9c43" path="/var/lib/kubelet/pods/5c6bef62-76c2-4908-a28e-7c2a2d9f9c43/volumes" Mar 20 14:40:09 crc kubenswrapper[4856]: I0320 14:40:09.988210 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 14:40:09 crc kubenswrapper[4856]: I0320 14:40:09.989072 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:40:35 crc kubenswrapper[4856]: I0320 14:40:35.506002 4856 scope.go:117] "RemoveContainer" containerID="49822c57c00fbb79a45bfac30ad0ac460af29d37a1242b9d0581605713a4434e" Mar 20 14:40:39 crc kubenswrapper[4856]: I0320 14:40:39.987650 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:40:39 crc kubenswrapper[4856]: I0320 14:40:39.989368 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:40:39 crc kubenswrapper[4856]: I0320 14:40:39.989576 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:40:39 crc kubenswrapper[4856]: I0320 14:40:39.991064 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 
20 14:40:39 crc kubenswrapper[4856]: I0320 14:40:39.991578 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" gracePeriod=600 Mar 20 14:40:40 crc kubenswrapper[4856]: E0320 14:40:40.142922 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:40:41 crc kubenswrapper[4856]: I0320 14:40:41.101319 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" exitCode=0 Mar 20 14:40:41 crc kubenswrapper[4856]: I0320 14:40:41.101373 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf"} Mar 20 14:40:41 crc kubenswrapper[4856]: I0320 14:40:41.101416 4856 scope.go:117] "RemoveContainer" containerID="1569ac6a1cdace58a20d872e8d6c1e7d68c17218654dde3ceed184d890d4a98d" Mar 20 14:40:41 crc kubenswrapper[4856]: I0320 14:40:41.102200 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:40:41 crc kubenswrapper[4856]: E0320 14:40:41.102571 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:40:51 crc kubenswrapper[4856]: I0320 14:40:51.820085 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:40:51 crc kubenswrapper[4856]: E0320 14:40:51.821038 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:03 crc kubenswrapper[4856]: I0320 14:41:03.820367 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:41:03 crc kubenswrapper[4856]: E0320 14:41:03.821157 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:18 crc kubenswrapper[4856]: I0320 14:41:18.820211 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:41:18 crc kubenswrapper[4856]: E0320 14:41:18.820898 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:32 crc kubenswrapper[4856]: I0320 14:41:32.819918 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:41:32 crc kubenswrapper[4856]: E0320 14:41:32.820587 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:43 crc kubenswrapper[4856]: I0320 14:41:43.911247 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:43 crc kubenswrapper[4856]: E0320 14:41:43.912034 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" containerName="oc" Mar 20 14:41:43 crc kubenswrapper[4856]: I0320 14:41:43.912048 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" containerName="oc" Mar 20 14:41:43 crc kubenswrapper[4856]: I0320 14:41:43.912243 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" containerName="oc" Mar 20 14:41:43 crc kubenswrapper[4856]: I0320 14:41:43.913172 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:43 crc kubenswrapper[4856]: I0320 14:41:43.930570 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.067587 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.067669 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.067759 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb24z\" (UniqueName: \"kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.169515 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.169602 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xb24z\" (UniqueName: \"kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.169665 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.170118 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.170123 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.197153 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb24z\" (UniqueName: \"kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z\") pod \"certified-operators-gglmd\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.234001 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.671307 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:44 crc kubenswrapper[4856]: I0320 14:41:44.820345 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:41:44 crc kubenswrapper[4856]: E0320 14:41:44.821079 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:45 crc kubenswrapper[4856]: I0320 14:41:45.657887 4856 generic.go:334] "Generic (PLEG): container finished" podID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerID="3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2" exitCode=0 Mar 20 14:41:45 crc kubenswrapper[4856]: I0320 14:41:45.657972 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerDied","Data":"3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2"} Mar 20 14:41:45 crc kubenswrapper[4856]: I0320 14:41:45.658245 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerStarted","Data":"a2dc18e85515682af3dd28ce22f53f57166f958565e877d5c7ecffedc0d1f26d"} Mar 20 14:41:45 crc kubenswrapper[4856]: I0320 14:41:45.660354 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 
14:41:46 crc kubenswrapper[4856]: I0320 14:41:46.666961 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerStarted","Data":"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d"} Mar 20 14:41:47 crc kubenswrapper[4856]: I0320 14:41:47.684609 4856 generic.go:334] "Generic (PLEG): container finished" podID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerID="8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d" exitCode=0 Mar 20 14:41:47 crc kubenswrapper[4856]: I0320 14:41:47.684676 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerDied","Data":"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d"} Mar 20 14:41:47 crc kubenswrapper[4856]: I0320 14:41:47.895283 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-n8c5c"] Mar 20 14:41:47 crc kubenswrapper[4856]: I0320 14:41:47.902169 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-n8c5c"] Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.018657 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9zvt5"] Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.019767 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.021456 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.021610 4856 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6fh9x" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.022156 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.025464 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9zvt5"] Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.026838 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.128550 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.128620 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x74rj\" (UniqueName: \"kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.128786 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt\") pod \"crc-storage-crc-9zvt5\" (UID: 
\"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.230191 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.230315 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.230348 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x74rj\" (UniqueName: \"kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.231134 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.231484 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.251005 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x74rj\" (UniqueName: \"kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj\") pod \"crc-storage-crc-9zvt5\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.340976 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:48 crc kubenswrapper[4856]: W0320 14:41:48.552594 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34ab98fa_3164_4d31_a65a_064face21cd0.slice/crio-be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d WatchSource:0}: Error finding container be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d: Status 404 returned error can't find the container with id be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.555188 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9zvt5"] Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.700695 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9zvt5" event={"ID":"34ab98fa-3164-4d31-a65a-064face21cd0","Type":"ContainerStarted","Data":"be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d"} Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.702989 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerStarted","Data":"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede"} Mar 20 14:41:48 crc kubenswrapper[4856]: I0320 14:41:48.721255 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-gglmd" podStartSLOduration=3.219690895 podStartE2EDuration="5.721230393s" podCreationTimestamp="2026-03-20 14:41:43 +0000 UTC" firstStartedPulling="2026-03-20 14:41:45.660020225 +0000 UTC m=+4720.541046355" lastFinishedPulling="2026-03-20 14:41:48.161559723 +0000 UTC m=+4723.042585853" observedRunningTime="2026-03-20 14:41:48.71892658 +0000 UTC m=+4723.599952700" watchObservedRunningTime="2026-03-20 14:41:48.721230393 +0000 UTC m=+4723.602256523" Mar 20 14:41:49 crc kubenswrapper[4856]: I0320 14:41:49.712859 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9zvt5" event={"ID":"34ab98fa-3164-4d31-a65a-064face21cd0","Type":"ContainerStarted","Data":"5df7da3b0674ae17f8322d726c908dc9845e14e6ce2dbb7a9710f477d78967b2"} Mar 20 14:41:49 crc kubenswrapper[4856]: I0320 14:41:49.736684 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="crc-storage/crc-storage-crc-9zvt5" podStartSLOduration=0.870079291 podStartE2EDuration="1.736663509s" podCreationTimestamp="2026-03-20 14:41:48 +0000 UTC" firstStartedPulling="2026-03-20 14:41:48.560270874 +0000 UTC m=+4723.441297004" lastFinishedPulling="2026-03-20 14:41:49.426855092 +0000 UTC m=+4724.307881222" observedRunningTime="2026-03-20 14:41:49.732516397 +0000 UTC m=+4724.613542537" watchObservedRunningTime="2026-03-20 14:41:49.736663509 +0000 UTC m=+4724.617689639" Mar 20 14:41:49 crc kubenswrapper[4856]: I0320 14:41:49.828910 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7" path="/var/lib/kubelet/pods/d50d0d3b-ef22-4f84-9d11-e648ccc4b7e7/volumes" Mar 20 14:41:50 crc kubenswrapper[4856]: I0320 14:41:50.721823 4856 generic.go:334] "Generic (PLEG): container finished" podID="34ab98fa-3164-4d31-a65a-064face21cd0" containerID="5df7da3b0674ae17f8322d726c908dc9845e14e6ce2dbb7a9710f477d78967b2" exitCode=0 Mar 20 14:41:50 crc kubenswrapper[4856]: I0320 14:41:50.721894 
4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9zvt5" event={"ID":"34ab98fa-3164-4d31-a65a-064face21cd0","Type":"ContainerDied","Data":"5df7da3b0674ae17f8322d726c908dc9845e14e6ce2dbb7a9710f477d78967b2"} Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.007678 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.086886 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt\") pod \"34ab98fa-3164-4d31-a65a-064face21cd0\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.086972 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x74rj\" (UniqueName: \"kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj\") pod \"34ab98fa-3164-4d31-a65a-064face21cd0\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.086983 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "34ab98fa-3164-4d31-a65a-064face21cd0" (UID: "34ab98fa-3164-4d31-a65a-064face21cd0"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.087056 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage\") pod \"34ab98fa-3164-4d31-a65a-064face21cd0\" (UID: \"34ab98fa-3164-4d31-a65a-064face21cd0\") " Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.087336 4856 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/34ab98fa-3164-4d31-a65a-064face21cd0-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.092943 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj" (OuterVolumeSpecName: "kube-api-access-x74rj") pod "34ab98fa-3164-4d31-a65a-064face21cd0" (UID: "34ab98fa-3164-4d31-a65a-064face21cd0"). InnerVolumeSpecName "kube-api-access-x74rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.110732 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "34ab98fa-3164-4d31-a65a-064face21cd0" (UID: "34ab98fa-3164-4d31-a65a-064face21cd0"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.188127 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x74rj\" (UniqueName: \"kubernetes.io/projected/34ab98fa-3164-4d31-a65a-064face21cd0-kube-api-access-x74rj\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.188169 4856 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/34ab98fa-3164-4d31-a65a-064face21cd0-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.740146 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9zvt5" event={"ID":"34ab98fa-3164-4d31-a65a-064face21cd0","Type":"ContainerDied","Data":"be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d"} Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.740187 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0f4f20122f789cd99e2143320a4935ddb31b37e0e0983b533076025386ef5d" Mar 20 14:41:52 crc kubenswrapper[4856]: I0320 14:41:52.740216 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9zvt5" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.191432 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-9zvt5"] Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.196247 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-9zvt5"] Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.234370 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.234670 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.279872 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.342244 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9wcc9"] Mar 20 14:41:54 crc kubenswrapper[4856]: E0320 14:41:54.342531 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34ab98fa-3164-4d31-a65a-064face21cd0" containerName="storage" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.342548 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="34ab98fa-3164-4d31-a65a-064face21cd0" containerName="storage" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.342707 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="34ab98fa-3164-4d31-a65a-064face21cd0" containerName="storage" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.343152 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.345817 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.345907 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.346113 4856 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6fh9x" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.346470 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.351866 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9wcc9"] Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.418392 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7dtx\" (UniqueName: \"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.418716 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.418838 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage\") pod \"crc-storage-crc-9wcc9\" (UID: 
\"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.520300 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7dtx\" (UniqueName: \"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.520364 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.520409 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.520641 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.521145 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.539866 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7dtx\" (UniqueName: \"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx\") pod \"crc-storage-crc-9wcc9\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.659400 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.806240 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:54 crc kubenswrapper[4856]: I0320 14:41:54.858173 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:55 crc kubenswrapper[4856]: I0320 14:41:55.083116 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9wcc9"] Mar 20 14:41:55 crc kubenswrapper[4856]: I0320 14:41:55.765151 4856 generic.go:334] "Generic (PLEG): container finished" podID="fe5c613c-2602-45c6-a8b7-121ccdeca042" containerID="0dea413998ae3b05856e2c5ce4604c979eecd81db5a453916c35e33da7bf80e0" exitCode=0 Mar 20 14:41:55 crc kubenswrapper[4856]: I0320 14:41:55.765233 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wcc9" event={"ID":"fe5c613c-2602-45c6-a8b7-121ccdeca042","Type":"ContainerDied","Data":"0dea413998ae3b05856e2c5ce4604c979eecd81db5a453916c35e33da7bf80e0"} Mar 20 14:41:55 crc kubenswrapper[4856]: I0320 14:41:55.765596 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wcc9" event={"ID":"fe5c613c-2602-45c6-a8b7-121ccdeca042","Type":"ContainerStarted","Data":"275a6111f444afe5a9d7db9911438288827a08f36519269953f81b62a824fb17"} Mar 20 14:41:55 crc kubenswrapper[4856]: I0320 14:41:55.828560 4856 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="34ab98fa-3164-4d31-a65a-064face21cd0" path="/var/lib/kubelet/pods/34ab98fa-3164-4d31-a65a-064face21cd0/volumes" Mar 20 14:41:56 crc kubenswrapper[4856]: I0320 14:41:56.772561 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gglmd" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="registry-server" containerID="cri-o://a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede" gracePeriod=2 Mar 20 14:41:56 crc kubenswrapper[4856]: I0320 14:41:56.819533 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:41:56 crc kubenswrapper[4856]: E0320 14:41:56.819872 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.037326 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.159187 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt\") pod \"fe5c613c-2602-45c6-a8b7-121ccdeca042\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.159352 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "fe5c613c-2602-45c6-a8b7-121ccdeca042" (UID: "fe5c613c-2602-45c6-a8b7-121ccdeca042"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.159388 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7dtx\" (UniqueName: \"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx\") pod \"fe5c613c-2602-45c6-a8b7-121ccdeca042\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.159488 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage\") pod \"fe5c613c-2602-45c6-a8b7-121ccdeca042\" (UID: \"fe5c613c-2602-45c6-a8b7-121ccdeca042\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.159790 4856 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/fe5c613c-2602-45c6-a8b7-121ccdeca042-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.341192 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx" (OuterVolumeSpecName: "kube-api-access-l7dtx") pod "fe5c613c-2602-45c6-a8b7-121ccdeca042" (UID: "fe5c613c-2602-45c6-a8b7-121ccdeca042"). InnerVolumeSpecName "kube-api-access-l7dtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.354119 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "fe5c613c-2602-45c6-a8b7-121ccdeca042" (UID: "fe5c613c-2602-45c6-a8b7-121ccdeca042"). InnerVolumeSpecName "crc-storage". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.362512 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7dtx\" (UniqueName: \"kubernetes.io/projected/fe5c613c-2602-45c6-a8b7-121ccdeca042-kube-api-access-l7dtx\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.362546 4856 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/fe5c613c-2602-45c6-a8b7-121ccdeca042-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.371732 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.463431 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content\") pod \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.463515 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities\") pod \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.463552 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb24z\" (UniqueName: \"kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z\") pod \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\" (UID: \"08fedc72-dab1-459e-9bd7-0290c7f66e4d\") " Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.465423 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities" (OuterVolumeSpecName: "utilities") pod "08fedc72-dab1-459e-9bd7-0290c7f66e4d" (UID: "08fedc72-dab1-459e-9bd7-0290c7f66e4d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.538826 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08fedc72-dab1-459e-9bd7-0290c7f66e4d" (UID: "08fedc72-dab1-459e-9bd7-0290c7f66e4d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.539065 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z" (OuterVolumeSpecName: "kube-api-access-xb24z") pod "08fedc72-dab1-459e-9bd7-0290c7f66e4d" (UID: "08fedc72-dab1-459e-9bd7-0290c7f66e4d"). InnerVolumeSpecName "kube-api-access-xb24z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.564544 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.564782 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08fedc72-dab1-459e-9bd7-0290c7f66e4d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.564851 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xb24z\" (UniqueName: \"kubernetes.io/projected/08fedc72-dab1-459e-9bd7-0290c7f66e4d-kube-api-access-xb24z\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.782021 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9wcc9" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.782015 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9wcc9" event={"ID":"fe5c613c-2602-45c6-a8b7-121ccdeca042","Type":"ContainerDied","Data":"275a6111f444afe5a9d7db9911438288827a08f36519269953f81b62a824fb17"} Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.782261 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="275a6111f444afe5a9d7db9911438288827a08f36519269953f81b62a824fb17" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.784588 4856 generic.go:334] "Generic (PLEG): container finished" podID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerID="a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede" exitCode=0 Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.784799 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerDied","Data":"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede"} Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.784859 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gglmd" event={"ID":"08fedc72-dab1-459e-9bd7-0290c7f66e4d","Type":"ContainerDied","Data":"a2dc18e85515682af3dd28ce22f53f57166f958565e877d5c7ecffedc0d1f26d"} Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.784892 4856 scope.go:117] "RemoveContainer" containerID="a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.784790 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gglmd" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.807147 4856 scope.go:117] "RemoveContainer" containerID="8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.835684 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.842177 4856 scope.go:117] "RemoveContainer" containerID="3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.843456 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gglmd"] Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.949183 4856 scope.go:117] "RemoveContainer" containerID="a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede" Mar 20 14:41:57 crc kubenswrapper[4856]: E0320 14:41:57.950054 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede\": container with ID starting with a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede not found: ID does not exist" containerID="a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.950219 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede"} err="failed to get container status \"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede\": rpc error: code = NotFound desc = could not find container \"a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede\": container with ID starting with a75e0be05c94d9b6e21fab96132a99fad376ef55aa24d037e5293521ad38eede not 
found: ID does not exist" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.950403 4856 scope.go:117] "RemoveContainer" containerID="8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d" Mar 20 14:41:57 crc kubenswrapper[4856]: E0320 14:41:57.950745 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d\": container with ID starting with 8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d not found: ID does not exist" containerID="8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.950846 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d"} err="failed to get container status \"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d\": rpc error: code = NotFound desc = could not find container \"8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d\": container with ID starting with 8c92fd3156bdc8bb745c379cdfdc2c8d330ce3a117350a739a1646b5a6a4e60d not found: ID does not exist" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.950930 4856 scope.go:117] "RemoveContainer" containerID="3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2" Mar 20 14:41:57 crc kubenswrapper[4856]: E0320 14:41:57.951303 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2\": container with ID starting with 3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2 not found: ID does not exist" containerID="3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2" Mar 20 14:41:57 crc kubenswrapper[4856]: I0320 14:41:57.951758 4856 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2"} err="failed to get container status \"3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2\": rpc error: code = NotFound desc = could not find container \"3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2\": container with ID starting with 3d309582b4b1f5a0ac949f7b34a23b8b8e33db3a33708c04ed4cc6c0190110f2 not found: ID does not exist" Mar 20 14:41:59 crc kubenswrapper[4856]: I0320 14:41:59.827346 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" path="/var/lib/kubelet/pods/08fedc72-dab1-459e-9bd7-0290c7f66e4d/volumes" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146337 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566962-l7pxc"] Mar 20 14:42:00 crc kubenswrapper[4856]: E0320 14:42:00.146662 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146685 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4856]: E0320 14:42:00.146708 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5c613c-2602-45c6-a8b7-121ccdeca042" containerName="storage" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146718 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5c613c-2602-45c6-a8b7-121ccdeca042" containerName="storage" Mar 20 14:42:00 crc kubenswrapper[4856]: E0320 14:42:00.146751 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="extract-utilities" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146762 4856 
state_mem.go:107] "Deleted CPUSet assignment" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="extract-utilities" Mar 20 14:42:00 crc kubenswrapper[4856]: E0320 14:42:00.146781 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146789 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146951 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5c613c-2602-45c6-a8b7-121ccdeca042" containerName="storage" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.146965 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fedc72-dab1-459e-9bd7-0290c7f66e4d" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.147513 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.149517 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.150465 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.150521 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.158590 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-l7pxc"] Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.203946 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrzt\" (UniqueName: \"kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt\") pod \"auto-csr-approver-29566962-l7pxc\" (UID: \"a681a7b2-6f2b-4a71-b035-5fb78e206899\") " pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.305126 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrzt\" (UniqueName: \"kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt\") pod \"auto-csr-approver-29566962-l7pxc\" (UID: \"a681a7b2-6f2b-4a71-b035-5fb78e206899\") " pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.323477 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrzt\" (UniqueName: \"kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt\") pod \"auto-csr-approver-29566962-l7pxc\" (UID: \"a681a7b2-6f2b-4a71-b035-5fb78e206899\") " 
pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.474814 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:00 crc kubenswrapper[4856]: I0320 14:42:00.892045 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-l7pxc"] Mar 20 14:42:00 crc kubenswrapper[4856]: W0320 14:42:00.898163 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda681a7b2_6f2b_4a71_b035_5fb78e206899.slice/crio-101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3 WatchSource:0}: Error finding container 101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3: Status 404 returned error can't find the container with id 101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3 Mar 20 14:42:01 crc kubenswrapper[4856]: I0320 14:42:01.815406 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" event={"ID":"a681a7b2-6f2b-4a71-b035-5fb78e206899","Type":"ContainerStarted","Data":"101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3"} Mar 20 14:42:02 crc kubenswrapper[4856]: I0320 14:42:02.822742 4856 generic.go:334] "Generic (PLEG): container finished" podID="a681a7b2-6f2b-4a71-b035-5fb78e206899" containerID="7762a226700abce90a407e2c9dc2415e34aa87b269c663a8dec2e5d3a6500e18" exitCode=0 Mar 20 14:42:02 crc kubenswrapper[4856]: I0320 14:42:02.822797 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" event={"ID":"a681a7b2-6f2b-4a71-b035-5fb78e206899","Type":"ContainerDied","Data":"7762a226700abce90a407e2c9dc2415e34aa87b269c663a8dec2e5d3a6500e18"} Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.091826 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.161130 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wrzt\" (UniqueName: \"kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt\") pod \"a681a7b2-6f2b-4a71-b035-5fb78e206899\" (UID: \"a681a7b2-6f2b-4a71-b035-5fb78e206899\") " Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.166841 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt" (OuterVolumeSpecName: "kube-api-access-7wrzt") pod "a681a7b2-6f2b-4a71-b035-5fb78e206899" (UID: "a681a7b2-6f2b-4a71-b035-5fb78e206899"). InnerVolumeSpecName "kube-api-access-7wrzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.262894 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wrzt\" (UniqueName: \"kubernetes.io/projected/a681a7b2-6f2b-4a71-b035-5fb78e206899-kube-api-access-7wrzt\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.840028 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" event={"ID":"a681a7b2-6f2b-4a71-b035-5fb78e206899","Type":"ContainerDied","Data":"101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3"} Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.840071 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101f933db7a482eb1763bf88ec9ab31b6e883a8e4ab262ed6a2c08f2bc19c3d3" Mar 20 14:42:04 crc kubenswrapper[4856]: I0320 14:42:04.840135 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-l7pxc" Mar 20 14:42:05 crc kubenswrapper[4856]: I0320 14:42:05.150298 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-zm4tl"] Mar 20 14:42:05 crc kubenswrapper[4856]: I0320 14:42:05.154998 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-zm4tl"] Mar 20 14:42:05 crc kubenswrapper[4856]: I0320 14:42:05.839161 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53ea005-eae6-4050-99c6-968812c0d0e2" path="/var/lib/kubelet/pods/f53ea005-eae6-4050-99c6-968812c0d0e2/volumes" Mar 20 14:42:08 crc kubenswrapper[4856]: I0320 14:42:08.820525 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:42:08 crc kubenswrapper[4856]: E0320 14:42:08.821132 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:42:22 crc kubenswrapper[4856]: I0320 14:42:22.820179 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:42:22 crc kubenswrapper[4856]: E0320 14:42:22.820966 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:42:35 crc kubenswrapper[4856]: I0320 14:42:35.599786 4856 scope.go:117] "RemoveContainer" containerID="741a54ac4f2b396237eb6e8a08de9cdd1664ea21d3cb18af9b7740cba06c85fa" Mar 20 14:42:35 crc kubenswrapper[4856]: I0320 14:42:35.651544 4856 scope.go:117] "RemoveContainer" containerID="2c182ab18c50fa8f22b8c3303f1801993710fd7b51c6d0c33910504a413a9c00" Mar 20 14:42:37 crc kubenswrapper[4856]: I0320 14:42:37.820112 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:42:37 crc kubenswrapper[4856]: E0320 14:42:37.820333 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:42:52 crc kubenswrapper[4856]: I0320 14:42:52.819743 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:42:52 crc kubenswrapper[4856]: E0320 14:42:52.821739 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.305421 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:03 crc kubenswrapper[4856]: E0320 14:43:03.306426 4856 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a681a7b2-6f2b-4a71-b035-5fb78e206899" containerName="oc" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.306445 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="a681a7b2-6f2b-4a71-b035-5fb78e206899" containerName="oc" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.306634 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="a681a7b2-6f2b-4a71-b035-5fb78e206899" containerName="oc" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.307897 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.311806 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.423644 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.423713 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxgwf\" (UniqueName: \"kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.423737 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content\") pod \"redhat-operators-8q5c5\" (UID: 
\"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.525564 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.525687 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.525746 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxgwf\" (UniqueName: \"kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.526362 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.526382 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " 
pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.548794 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxgwf\" (UniqueName: \"kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf\") pod \"redhat-operators-8q5c5\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:03 crc kubenswrapper[4856]: I0320 14:43:03.627392 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:04 crc kubenswrapper[4856]: I0320 14:43:04.078572 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:04 crc kubenswrapper[4856]: W0320 14:43:04.092771 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453d4f02_3f02_4aa5_8695_3c9d67c49110.slice/crio-9ba683d03712041e4fcdb7cfbf942bea8a83a1471a6a67b69c6f816b65c48859 WatchSource:0}: Error finding container 9ba683d03712041e4fcdb7cfbf942bea8a83a1471a6a67b69c6f816b65c48859: Status 404 returned error can't find the container with id 9ba683d03712041e4fcdb7cfbf942bea8a83a1471a6a67b69c6f816b65c48859 Mar 20 14:43:04 crc kubenswrapper[4856]: I0320 14:43:04.308434 4856 generic.go:334] "Generic (PLEG): container finished" podID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerID="1c21c47c908511dc4a29d28adbc7d0911f97e537684f2bc5ef51fb6cdd52345c" exitCode=0 Mar 20 14:43:04 crc kubenswrapper[4856]: I0320 14:43:04.309239 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerDied","Data":"1c21c47c908511dc4a29d28adbc7d0911f97e537684f2bc5ef51fb6cdd52345c"} Mar 20 14:43:04 crc kubenswrapper[4856]: I0320 14:43:04.309388 4856 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerStarted","Data":"9ba683d03712041e4fcdb7cfbf942bea8a83a1471a6a67b69c6f816b65c48859"} Mar 20 14:43:05 crc kubenswrapper[4856]: I0320 14:43:05.318029 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerStarted","Data":"f2a5385c4ac96906011088b8cad64153080c5c15e4d396f7b9544c301b854bde"} Mar 20 14:43:06 crc kubenswrapper[4856]: I0320 14:43:06.326337 4856 generic.go:334] "Generic (PLEG): container finished" podID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerID="f2a5385c4ac96906011088b8cad64153080c5c15e4d396f7b9544c301b854bde" exitCode=0 Mar 20 14:43:06 crc kubenswrapper[4856]: I0320 14:43:06.326425 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerDied","Data":"f2a5385c4ac96906011088b8cad64153080c5c15e4d396f7b9544c301b854bde"} Mar 20 14:43:06 crc kubenswrapper[4856]: I0320 14:43:06.820221 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:43:06 crc kubenswrapper[4856]: E0320 14:43:06.820715 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:43:07 crc kubenswrapper[4856]: I0320 14:43:07.342258 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" 
event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerStarted","Data":"1735247b51ea6152efdd67a839d9495546da5f5a21c8455bbc0ef0ee5af0533f"} Mar 20 14:43:07 crc kubenswrapper[4856]: I0320 14:43:07.371703 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8q5c5" podStartSLOduration=1.86120193 podStartE2EDuration="4.371684962s" podCreationTimestamp="2026-03-20 14:43:03 +0000 UTC" firstStartedPulling="2026-03-20 14:43:04.310406202 +0000 UTC m=+4799.191432332" lastFinishedPulling="2026-03-20 14:43:06.820889224 +0000 UTC m=+4801.701915364" observedRunningTime="2026-03-20 14:43:07.367336293 +0000 UTC m=+4802.248362453" watchObservedRunningTime="2026-03-20 14:43:07.371684962 +0000 UTC m=+4802.252711092" Mar 20 14:43:13 crc kubenswrapper[4856]: I0320 14:43:13.628222 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:13 crc kubenswrapper[4856]: I0320 14:43:13.628866 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:13 crc kubenswrapper[4856]: I0320 14:43:13.672643 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:14 crc kubenswrapper[4856]: I0320 14:43:14.523901 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:14 crc kubenswrapper[4856]: I0320 14:43:14.569987 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:16 crc kubenswrapper[4856]: I0320 14:43:16.404227 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8q5c5" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="registry-server" 
containerID="cri-o://1735247b51ea6152efdd67a839d9495546da5f5a21c8455bbc0ef0ee5af0533f" gracePeriod=2 Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.424070 4856 generic.go:334] "Generic (PLEG): container finished" podID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerID="1735247b51ea6152efdd67a839d9495546da5f5a21c8455bbc0ef0ee5af0533f" exitCode=0 Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.424398 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerDied","Data":"1735247b51ea6152efdd67a839d9495546da5f5a21c8455bbc0ef0ee5af0533f"} Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.615198 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.648878 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxgwf\" (UniqueName: \"kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf\") pod \"453d4f02-3f02-4aa5-8695-3c9d67c49110\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.648944 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities\") pod \"453d4f02-3f02-4aa5-8695-3c9d67c49110\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.648968 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content\") pod \"453d4f02-3f02-4aa5-8695-3c9d67c49110\" (UID: \"453d4f02-3f02-4aa5-8695-3c9d67c49110\") " Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 
14:43:18.650249 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities" (OuterVolumeSpecName: "utilities") pod "453d4f02-3f02-4aa5-8695-3c9d67c49110" (UID: "453d4f02-3f02-4aa5-8695-3c9d67c49110"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.660996 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf" (OuterVolumeSpecName: "kube-api-access-bxgwf") pod "453d4f02-3f02-4aa5-8695-3c9d67c49110" (UID: "453d4f02-3f02-4aa5-8695-3c9d67c49110"). InnerVolumeSpecName "kube-api-access-bxgwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.750537 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxgwf\" (UniqueName: \"kubernetes.io/projected/453d4f02-3f02-4aa5-8695-3c9d67c49110-kube-api-access-bxgwf\") on node \"crc\" DevicePath \"\"" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.750762 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.796209 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "453d4f02-3f02-4aa5-8695-3c9d67c49110" (UID: "453d4f02-3f02-4aa5-8695-3c9d67c49110"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:43:18 crc kubenswrapper[4856]: I0320 14:43:18.852913 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/453d4f02-3f02-4aa5-8695-3c9d67c49110-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.433643 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8q5c5" event={"ID":"453d4f02-3f02-4aa5-8695-3c9d67c49110","Type":"ContainerDied","Data":"9ba683d03712041e4fcdb7cfbf942bea8a83a1471a6a67b69c6f816b65c48859"} Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.433720 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8q5c5" Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.433727 4856 scope.go:117] "RemoveContainer" containerID="1735247b51ea6152efdd67a839d9495546da5f5a21c8455bbc0ef0ee5af0533f" Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.469229 4856 scope.go:117] "RemoveContainer" containerID="f2a5385c4ac96906011088b8cad64153080c5c15e4d396f7b9544c301b854bde" Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.479801 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.488589 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8q5c5"] Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.516191 4856 scope.go:117] "RemoveContainer" containerID="1c21c47c908511dc4a29d28adbc7d0911f97e537684f2bc5ef51fb6cdd52345c" Mar 20 14:43:19 crc kubenswrapper[4856]: I0320 14:43:19.827667 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" path="/var/lib/kubelet/pods/453d4f02-3f02-4aa5-8695-3c9d67c49110/volumes" Mar 20 14:43:21 crc 
kubenswrapper[4856]: I0320 14:43:21.820324 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:43:21 crc kubenswrapper[4856]: E0320 14:43:21.821582 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:43:34 crc kubenswrapper[4856]: I0320 14:43:34.820127 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:43:34 crc kubenswrapper[4856]: E0320 14:43:34.820932 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:43:35 crc kubenswrapper[4856]: I0320 14:43:35.749929 4856 scope.go:117] "RemoveContainer" containerID="c210cb97d0430119c4e6fae8133cc46478a510858ab9ddb7026e9890f63fdefa" Mar 20 14:43:35 crc kubenswrapper[4856]: I0320 14:43:35.774462 4856 scope.go:117] "RemoveContainer" containerID="4b889530d83c87115579a2115b5630c15e100aa9e71fea435d13cceaa034e2e7" Mar 20 14:43:35 crc kubenswrapper[4856]: I0320 14:43:35.794880 4856 scope.go:117] "RemoveContainer" containerID="8c402ecd87b0f0e3700678aa682bf077d6c20e8af284e8acade93eaf2d6eb468" Mar 20 14:43:47 crc kubenswrapper[4856]: I0320 14:43:47.820304 4856 scope.go:117] "RemoveContainer" 
containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:43:47 crc kubenswrapper[4856]: E0320 14:43:47.821189 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.138080 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566964-wxcp9"] Mar 20 14:44:00 crc kubenswrapper[4856]: E0320 14:44:00.138865 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="registry-server" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.138876 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="registry-server" Mar 20 14:44:00 crc kubenswrapper[4856]: E0320 14:44:00.138891 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="extract-content" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.138896 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="extract-content" Mar 20 14:44:00 crc kubenswrapper[4856]: E0320 14:44:00.138910 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="extract-utilities" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.138916 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="extract-utilities" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 
14:44:00.139039 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="453d4f02-3f02-4aa5-8695-3c9d67c49110" containerName="registry-server" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.139617 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.142213 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.143046 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.144486 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.147879 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-wxcp9"] Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.155819 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cssvj\" (UniqueName: \"kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj\") pod \"auto-csr-approver-29566964-wxcp9\" (UID: \"66ce2de7-9712-4031-b5c8-edbfe5c775cf\") " pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.257686 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cssvj\" (UniqueName: \"kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj\") pod \"auto-csr-approver-29566964-wxcp9\" (UID: \"66ce2de7-9712-4031-b5c8-edbfe5c775cf\") " pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.282431 4856 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cssvj\" (UniqueName: \"kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj\") pod \"auto-csr-approver-29566964-wxcp9\" (UID: \"66ce2de7-9712-4031-b5c8-edbfe5c775cf\") " pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.459591 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.668447 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-wxcp9"] Mar 20 14:44:00 crc kubenswrapper[4856]: I0320 14:44:00.722376 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" event={"ID":"66ce2de7-9712-4031-b5c8-edbfe5c775cf","Type":"ContainerStarted","Data":"f15d930b1aa052e0a062223e19f273a5188fb520800672f1803f4a16d4a1ab0b"} Mar 20 14:44:02 crc kubenswrapper[4856]: I0320 14:44:02.749012 4856 generic.go:334] "Generic (PLEG): container finished" podID="66ce2de7-9712-4031-b5c8-edbfe5c775cf" containerID="6d44629c872b7b54ea08f37404c1e282ada32c159a4d01ae8818a9aa08200882" exitCode=0 Mar 20 14:44:02 crc kubenswrapper[4856]: I0320 14:44:02.749061 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" event={"ID":"66ce2de7-9712-4031-b5c8-edbfe5c775cf","Type":"ContainerDied","Data":"6d44629c872b7b54ea08f37404c1e282ada32c159a4d01ae8818a9aa08200882"} Mar 20 14:44:02 crc kubenswrapper[4856]: I0320 14:44:02.819566 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:44:02 crc kubenswrapper[4856]: E0320 14:44:02.819767 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.023628 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.215306 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cssvj\" (UniqueName: \"kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj\") pod \"66ce2de7-9712-4031-b5c8-edbfe5c775cf\" (UID: \"66ce2de7-9712-4031-b5c8-edbfe5c775cf\") " Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.221881 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj" (OuterVolumeSpecName: "kube-api-access-cssvj") pod "66ce2de7-9712-4031-b5c8-edbfe5c775cf" (UID: "66ce2de7-9712-4031-b5c8-edbfe5c775cf"). InnerVolumeSpecName "kube-api-access-cssvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.317195 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cssvj\" (UniqueName: \"kubernetes.io/projected/66ce2de7-9712-4031-b5c8-edbfe5c775cf-kube-api-access-cssvj\") on node \"crc\" DevicePath \"\"" Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.763163 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" event={"ID":"66ce2de7-9712-4031-b5c8-edbfe5c775cf","Type":"ContainerDied","Data":"f15d930b1aa052e0a062223e19f273a5188fb520800672f1803f4a16d4a1ab0b"} Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.763222 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f15d930b1aa052e0a062223e19f273a5188fb520800672f1803f4a16d4a1ab0b" Mar 20 14:44:04 crc kubenswrapper[4856]: I0320 14:44:04.763306 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-wxcp9" Mar 20 14:44:05 crc kubenswrapper[4856]: I0320 14:44:05.086476 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-qq7b2"] Mar 20 14:44:05 crc kubenswrapper[4856]: I0320 14:44:05.091398 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-qq7b2"] Mar 20 14:44:05 crc kubenswrapper[4856]: I0320 14:44:05.836428 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371" path="/var/lib/kubelet/pods/21c7cbcf-ad8f-4ec6-b6e3-8a1c2ff83371/volumes" Mar 20 14:44:16 crc kubenswrapper[4856]: I0320 14:44:16.819697 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:44:16 crc kubenswrapper[4856]: E0320 14:44:16.820451 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:44:31 crc kubenswrapper[4856]: I0320 14:44:31.822468 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:44:31 crc kubenswrapper[4856]: E0320 14:44:31.823175 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:44:35 crc kubenswrapper[4856]: I0320 14:44:35.841231 4856 scope.go:117] "RemoveContainer" containerID="0d9ae4329313a74efcda8912ea6300ba8fcee2e9b7d9c4ab6ed28ad283569e20" Mar 20 14:44:42 crc kubenswrapper[4856]: I0320 14:44:42.819904 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:44:42 crc kubenswrapper[4856]: E0320 14:44:42.820821 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:44:57 crc kubenswrapper[4856]: I0320 14:44:57.820385 4856 scope.go:117] "RemoveContainer" 
containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:44:57 crc kubenswrapper[4856]: E0320 14:44:57.821154 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.143399 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj"] Mar 20 14:45:00 crc kubenswrapper[4856]: E0320 14:45:00.144260 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66ce2de7-9712-4031-b5c8-edbfe5c775cf" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.144297 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="66ce2de7-9712-4031-b5c8-edbfe5c775cf" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.144498 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="66ce2de7-9712-4031-b5c8-edbfe5c775cf" containerName="oc" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.145143 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.147355 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.148005 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.167241 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj"] Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.284990 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdkn\" (UniqueName: \"kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.285070 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.285189 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.386688 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.386819 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.386867 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdkn\" (UniqueName: \"kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.387604 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.392898 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.403190 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdkn\" (UniqueName: \"kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn\") pod \"collect-profiles-29566965-6mgzj\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.468321 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:00 crc kubenswrapper[4856]: I0320 14:45:00.873470 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj"] Mar 20 14:45:01 crc kubenswrapper[4856]: I0320 14:45:01.190151 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" event={"ID":"131a2338-cbb1-47df-a5b3-143f0d7e1ec0","Type":"ContainerStarted","Data":"fd9fdd6884a4a1840e1cfe30b20424a8f411a8ec9ea1f121c9cfd10fe1862191"} Mar 20 14:45:01 crc kubenswrapper[4856]: I0320 14:45:01.190678 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" event={"ID":"131a2338-cbb1-47df-a5b3-143f0d7e1ec0","Type":"ContainerStarted","Data":"66ded631358d8fe130c1abdd9f1def8370544cfe37af707c0d1f4af48748d980"} Mar 20 14:45:01 crc kubenswrapper[4856]: I0320 14:45:01.206837 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" 
podStartSLOduration=1.2068131659999999 podStartE2EDuration="1.206813166s" podCreationTimestamp="2026-03-20 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:01.20511772 +0000 UTC m=+4916.086143870" watchObservedRunningTime="2026-03-20 14:45:01.206813166 +0000 UTC m=+4916.087839296" Mar 20 14:45:02 crc kubenswrapper[4856]: I0320 14:45:02.198718 4856 generic.go:334] "Generic (PLEG): container finished" podID="131a2338-cbb1-47df-a5b3-143f0d7e1ec0" containerID="fd9fdd6884a4a1840e1cfe30b20424a8f411a8ec9ea1f121c9cfd10fe1862191" exitCode=0 Mar 20 14:45:02 crc kubenswrapper[4856]: I0320 14:45:02.198756 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" event={"ID":"131a2338-cbb1-47df-a5b3-143f0d7e1ec0","Type":"ContainerDied","Data":"fd9fdd6884a4a1840e1cfe30b20424a8f411a8ec9ea1f121c9cfd10fe1862191"} Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.511679 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.630286 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume\") pod \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.630385 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdkn\" (UniqueName: \"kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn\") pod \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.630501 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume\") pod \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\" (UID: \"131a2338-cbb1-47df-a5b3-143f0d7e1ec0\") " Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.631077 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume" (OuterVolumeSpecName: "config-volume") pod "131a2338-cbb1-47df-a5b3-143f0d7e1ec0" (UID: "131a2338-cbb1-47df-a5b3-143f0d7e1ec0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.636830 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn" (OuterVolumeSpecName: "kube-api-access-ftdkn") pod "131a2338-cbb1-47df-a5b3-143f0d7e1ec0" (UID: "131a2338-cbb1-47df-a5b3-143f0d7e1ec0"). 
InnerVolumeSpecName "kube-api-access-ftdkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.642489 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "131a2338-cbb1-47df-a5b3-143f0d7e1ec0" (UID: "131a2338-cbb1-47df-a5b3-143f0d7e1ec0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.731609 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdkn\" (UniqueName: \"kubernetes.io/projected/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-kube-api-access-ftdkn\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.731638 4856 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:03 crc kubenswrapper[4856]: I0320 14:45:03.731647 4856 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/131a2338-cbb1-47df-a5b3-143f0d7e1ec0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:04 crc kubenswrapper[4856]: I0320 14:45:04.214101 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" event={"ID":"131a2338-cbb1-47df-a5b3-143f0d7e1ec0","Type":"ContainerDied","Data":"66ded631358d8fe130c1abdd9f1def8370544cfe37af707c0d1f4af48748d980"} Mar 20 14:45:04 crc kubenswrapper[4856]: I0320 14:45:04.214143 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66ded631358d8fe130c1abdd9f1def8370544cfe37af707c0d1f4af48748d980" Mar 20 14:45:04 crc kubenswrapper[4856]: I0320 14:45:04.214178 4856 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-6mgzj" Mar 20 14:45:04 crc kubenswrapper[4856]: I0320 14:45:04.285783 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs"] Mar 20 14:45:04 crc kubenswrapper[4856]: I0320 14:45:04.292956 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-vv5hs"] Mar 20 14:45:05 crc kubenswrapper[4856]: I0320 14:45:05.830189 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e31f256-88cb-4b92-948d-21602388727a" path="/var/lib/kubelet/pods/3e31f256-88cb-4b92-948d-21602388727a/volumes" Mar 20 14:45:08 crc kubenswrapper[4856]: I0320 14:45:08.820556 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:45:08 crc kubenswrapper[4856]: E0320 14:45:08.821439 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:45:23 crc kubenswrapper[4856]: I0320 14:45:23.823114 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:45:23 crc kubenswrapper[4856]: E0320 14:45:23.823846 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.868689 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"] Mar 20 14:45:24 crc kubenswrapper[4856]: E0320 14:45:24.873647 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131a2338-cbb1-47df-a5b3-143f0d7e1ec0" containerName="collect-profiles" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.873691 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="131a2338-cbb1-47df-a5b3-143f0d7e1ec0" containerName="collect-profiles" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.874241 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="131a2338-cbb1-47df-a5b3-143f0d7e1ec0" containerName="collect-profiles" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.876662 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.885237 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.889678 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.889903 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.889968 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-z84db" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.919310 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-559b986c67-lcd2m"] Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.920783 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.924988 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.940057 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"] Mar 20 14:45:24 crc kubenswrapper[4856]: I0320 14:45:24.966859 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-559b986c67-lcd2m"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.029998 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559b986c67-lcd2m"] Mar 20 14:45:25 crc kubenswrapper[4856]: E0320 14:45:25.030553 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-wzb8b], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-559b986c67-lcd2m" podUID="af4c8278-712d-4354-9410-9be0288fc18f" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.044101 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzb8b\" (UniqueName: \"kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.044217 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config\") pod \"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.044262 4856 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d95d\" (UniqueName: \"kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d\") pod \"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.044337 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.044381 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.100991 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.102478 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.116372 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.145848 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.145921 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.145988 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzb8b\" (UniqueName: \"kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.146042 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config\") pod \"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.146069 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6d95d\" (UniqueName: 
\"kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d\") pod \"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.147367 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.148077 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.148999 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config\") pod \"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.180224 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzb8b\" (UniqueName: \"kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b\") pod \"dnsmasq-dns-559b986c67-lcd2m\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.194042 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d95d\" (UniqueName: \"kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d\") pod 
\"dnsmasq-dns-58f44444cf-2l8xb\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") " pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.200774 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.252059 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdkg\" (UniqueName: \"kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.252151 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.252243 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.353715 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.353816 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.353889 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdkg\" (UniqueName: \"kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.354715 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.354732 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.372997 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.374956 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdkg\" (UniqueName: \"kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg\") pod \"dnsmasq-dns-5d7b5456f5-wf9gp\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.391987 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.418386 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.448668 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.480497 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.481938 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.506145 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.557507 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzb8b\" (UniqueName: \"kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b\") pod \"af4c8278-712d-4354-9410-9be0288fc18f\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.557592 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config\") pod \"af4c8278-712d-4354-9410-9be0288fc18f\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.557664 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc\") pod \"af4c8278-712d-4354-9410-9be0288fc18f\" (UID: \"af4c8278-712d-4354-9410-9be0288fc18f\") " Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.558333 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af4c8278-712d-4354-9410-9be0288fc18f" (UID: "af4c8278-712d-4354-9410-9be0288fc18f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.558579 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config" (OuterVolumeSpecName: "config") pod "af4c8278-712d-4354-9410-9be0288fc18f" (UID: "af4c8278-712d-4354-9410-9be0288fc18f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.558766 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.558782 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af4c8278-712d-4354-9410-9be0288fc18f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.568439 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b" (OuterVolumeSpecName: "kube-api-access-wzb8b") pod "af4c8278-712d-4354-9410-9be0288fc18f" (UID: "af4c8278-712d-4354-9410-9be0288fc18f"). InnerVolumeSpecName "kube-api-access-wzb8b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.660000 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.660102 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.660122 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cz2t\" (UniqueName: \"kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.660206 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzb8b\" (UniqueName: \"kubernetes.io/projected/af4c8278-712d-4354-9410-9be0288fc18f-kube-api-access-wzb8b\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.761472 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.761538 4856 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-7cz2t\" (UniqueName: \"kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.761596 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.762469 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.762476 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.779778 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cz2t\" (UniqueName: \"kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t\") pod \"dnsmasq-dns-98ddfc8f-p8np8\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.813903 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"] Mar 20 14:45:25 crc kubenswrapper[4856]: 
W0320 14:45:25.814180 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4a82ab0_eb0b_4cdf_bee2_e0a85f0e9529.slice/crio-95cf570a922a711fc5ae888ecca7faddd2de608e027d066a08c4b2662915d7be WatchSource:0}: Error finding container 95cf570a922a711fc5ae888ecca7faddd2de608e027d066a08c4b2662915d7be: Status 404 returned error can't find the container with id 95cf570a922a711fc5ae888ecca7faddd2de608e027d066a08c4b2662915d7be Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.822906 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:25 crc kubenswrapper[4856]: I0320 14:45:25.962747 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:25 crc kubenswrapper[4856]: W0320 14:45:25.989450 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd07d285c_35c6_4e5b_aa2b_7f88a4685b7f.slice/crio-792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4 WatchSource:0}: Error finding container 792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4: Status 404 returned error can't find the container with id 792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4 Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.021248 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.022734 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.025383 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.025418 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.025383 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bt9fn" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.025917 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.032682 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.036448 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165731 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trhz\" (UniqueName: \"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-kube-api-access-7trhz\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165797 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e716ae39-1ad1-47c2-ac59-04b100421073-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165825 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165886 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165916 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165958 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.165981 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e716ae39-1ad1-47c2-ac59-04b100421073-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.166007 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e716ae39-1ad1-47c2-ac59-04b100421073-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.267865 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268281 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268306 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e716ae39-1ad1-47c2-ac59-04b100421073-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268344 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e716ae39-1ad1-47c2-ac59-04b100421073-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268386 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trhz\" (UniqueName: 
\"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-kube-api-access-7trhz\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268434 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e716ae39-1ad1-47c2-ac59-04b100421073-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268459 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.268509 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.269395 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.269432 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.269776 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e716ae39-1ad1-47c2-ac59-04b100421073-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.272386 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e716ae39-1ad1-47c2-ac59-04b100421073-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.272426 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.272509 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e716ae39-1ad1-47c2-ac59-04b100421073-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.274227 4856 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.274261 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f2311a35e00da0470e70dbd8eb411b9e16379ff9c293996b3cfd0504f2b2e569/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.295541 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trhz\" (UniqueName: \"kubernetes.io/projected/e716ae39-1ad1-47c2-ac59-04b100421073-kube-api-access-7trhz\") pod \"rabbitmq-server-0\" (UID: \"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.297905 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:45:26 crc kubenswrapper[4856]: W0320 14:45:26.304965 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1718e84b_4ba3_4c92_b2bb_c9ab9987e62b.slice/crio-0ff592bb8011e0da49be12e7bde21e9753381de45a0f87312316880eb457fd88 WatchSource:0}: Error finding container 0ff592bb8011e0da49be12e7bde21e9753381de45a0f87312316880eb457fd88: Status 404 returned error can't find the container with id 0ff592bb8011e0da49be12e7bde21e9753381de45a0f87312316880eb457fd88 Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.314360 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-24891bf3-bfaa-4747-b1c1-951ef058b3d9\") pod \"rabbitmq-server-0\" (UID: 
\"e716ae39-1ad1-47c2-ac59-04b100421073\") " pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.353310 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.383635 4856 generic.go:334] "Generic (PLEG): container finished" podID="b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" containerID="45deb2b81d38f6a34691141270cc88948720ff720315f959e91fb50201fe11cd" exitCode=0 Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.383736 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" event={"ID":"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529","Type":"ContainerDied","Data":"45deb2b81d38f6a34691141270cc88948720ff720315f959e91fb50201fe11cd"} Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.383780 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" event={"ID":"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529","Type":"ContainerStarted","Data":"95cf570a922a711fc5ae888ecca7faddd2de608e027d066a08c4b2662915d7be"} Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.385287 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" event={"ID":"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b","Type":"ContainerStarted","Data":"0ff592bb8011e0da49be12e7bde21e9753381de45a0f87312316880eb457fd88"} Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.391688 4856 generic.go:334] "Generic (PLEG): container finished" podID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerID="51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4" exitCode=0 Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.391777 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-559b986c67-lcd2m" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.392143 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" event={"ID":"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f","Type":"ContainerDied","Data":"51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4"} Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.392186 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" event={"ID":"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f","Type":"ContainerStarted","Data":"792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4"} Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.415656 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.416934 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.421886 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.421895 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.422015 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.422146 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-s99cz" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.422188 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.445655 4856 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.482353 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-559b986c67-lcd2m"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.493843 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-559b986c67-lcd2m"] Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573083 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573139 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573191 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573229 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573289 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573313 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573366 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.573384 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghr5n\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-kube-api-access-ghr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674334 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674611 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674656 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674682 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674711 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674726 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674756 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.674772 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghr5n\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-kube-api-access-ghr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.675449 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.676757 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.679387 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.681126 4856 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.681141 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.681837 4856 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.681866 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e42969ddf82dc2f6d2cd40dabf9a2046b1ee567980b97812ea64a8bd724ab91a/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.683047 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:45:26 crc kubenswrapper[4856]: E0320 14:45:26.686196 4856 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 20 14:45:26 crc kubenswrapper[4856]: rpc error: 
code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 14:45:26 crc kubenswrapper[4856]: > podSandboxID="792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4"
Mar 20 14:45:26 crc kubenswrapper[4856]: E0320 14:45:26.686367 4856 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 14:45:26 crc kubenswrapper[4856]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8chc6h5bh56fh546hb7hc8h67h5bchffh577h697h5b5h5bdh59bhf6hf4h558hb5h578h595h5cchfbh644h59ch7fh654h547h587h5cbh5d5h8fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qxdkg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5d7b5456f5-wf9gp_openstack(d07d285c-35c6-4e5b-aa2b-7f88a4685b7f): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 14:45:26 crc kubenswrapper[4856]: > logger="UnhandledError"
Mar 20 14:45:26 crc kubenswrapper[4856]: E0320 14:45:26.687627 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f"
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.698646 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghr5n\" (UniqueName: \"kubernetes.io/projected/b16d74bd-8cbc-4a22-a06d-a5b7c15859ab-kube-api-access-ghr5n\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.716049 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-444a0c5e-5a24-4b02-ac0a-bde7fd17daa7\") pod \"rabbitmq-cell1-server-0\" (UID: \"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.746749 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.766307 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb"
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.878374 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d95d\" (UniqueName: \"kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d\") pod \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") "
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.878588 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config\") pod \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\" (UID: \"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529\") "
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.881778 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d" (OuterVolumeSpecName: "kube-api-access-6d95d") pod "b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" (UID: "b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529"). InnerVolumeSpecName "kube-api-access-6d95d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.894311 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config" (OuterVolumeSpecName: "config") pod "b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" (UID: "b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.980123 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-config\") on node \"crc\" DevicePath \"\""
Mar 20 14:45:26 crc kubenswrapper[4856]: I0320 14:45:26.980522 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d95d\" (UniqueName: \"kubernetes.io/projected/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529-kube-api-access-6d95d\") on node \"crc\" DevicePath \"\""
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.001174 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 14:45:27 crc kubenswrapper[4856]: W0320 14:45:27.007600 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode716ae39_1ad1_47c2_ac59_04b100421073.slice/crio-fb86ce5de13c6909a6c425f7a3d8e40a0c326296eb5c2351f4ef7aaf71f8a519 WatchSource:0}: Error finding container fb86ce5de13c6909a6c425f7a3d8e40a0c326296eb5c2351f4ef7aaf71f8a519: Status 404 returned error can't find the container with id fb86ce5de13c6909a6c425f7a3d8e40a0c326296eb5c2351f4ef7aaf71f8a519
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.190026 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 14:45:27 crc kubenswrapper[4856]: W0320 14:45:27.201235 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb16d74bd_8cbc_4a22_a06d_a5b7c15859ab.slice/crio-12109cec11cb6b402e431acc2166377185e14988bb59afcb48767bafffd05aa5 WatchSource:0}: Error finding container 12109cec11cb6b402e431acc2166377185e14988bb59afcb48767bafffd05aa5: Status 404 returned error can't find the container with id 12109cec11cb6b402e431acc2166377185e14988bb59afcb48767bafffd05aa5
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.401945 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb" event={"ID":"b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529","Type":"ContainerDied","Data":"95cf570a922a711fc5ae888ecca7faddd2de608e027d066a08c4b2662915d7be"}
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.401998 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58f44444cf-2l8xb"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.402002 4856 scope.go:117] "RemoveContainer" containerID="45deb2b81d38f6a34691141270cc88948720ff720315f959e91fb50201fe11cd"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.403673 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e716ae39-1ad1-47c2-ac59-04b100421073","Type":"ContainerStarted","Data":"fb86ce5de13c6909a6c425f7a3d8e40a0c326296eb5c2351f4ef7aaf71f8a519"}
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.405805 4856 generic.go:334] "Generic (PLEG): container finished" podID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerID="0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08" exitCode=0
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.405898 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" event={"ID":"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b","Type":"ContainerDied","Data":"0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08"}
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.407792 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab","Type":"ContainerStarted","Data":"12109cec11cb6b402e431acc2166377185e14988bb59afcb48767bafffd05aa5"}
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.525657 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"]
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.532663 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58f44444cf-2l8xb"]
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.666366 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 14:45:27 crc kubenswrapper[4856]: E0320 14:45:27.666694 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" containerName="init"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.666705 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" containerName="init"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.666850 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" containerName="init"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.667689 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.674174 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.674360 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.674609 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.674796 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-9ztsd"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.678585 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.679343 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.795830 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-default\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.795909 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-5112f5f6-1e38-40f8-8515-1530584890a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5112f5f6-1e38-40f8-8515-1530584890a6\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.795957 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.795995 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-kolla-config\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.796023 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.796079 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt4lr\" (UniqueName: \"kubernetes.io/projected/700a6806-8beb-4f1b-8462-dd888f16714d-kube-api-access-dt4lr\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.796156 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.796179 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.831885 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af4c8278-712d-4354-9410-9be0288fc18f" path="/var/lib/kubelet/pods/af4c8278-712d-4354-9410-9be0288fc18f/volumes"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.832880 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529" path="/var/lib/kubelet/pods/b4a82ab0-eb0b-4cdf-bee2-e0a85f0e9529/volumes"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897621 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897668 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897710 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-default\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897746 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-5112f5f6-1e38-40f8-8515-1530584890a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5112f5f6-1e38-40f8-8515-1530584890a6\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897779 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897810 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-kolla-config\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897833 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.897869 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt4lr\" (UniqueName: \"kubernetes.io/projected/700a6806-8beb-4f1b-8462-dd888f16714d-kube-api-access-dt4lr\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.899349 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-generated\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.899657 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-kolla-config\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.899769 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-config-data-default\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.899986 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/700a6806-8beb-4f1b-8462-dd888f16714d-operator-scripts\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.903994 4856 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.904047 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-5112f5f6-1e38-40f8-8515-1530584890a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5112f5f6-1e38-40f8-8515-1530584890a6\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/02e589737901142a752aecb19c7e80d0893f415915ce1ac11e0fb162adfd3f92/globalmount\"" pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.939536 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:27 crc kubenswrapper[4856]: I0320 14:45:27.939649 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700a6806-8beb-4f1b-8462-dd888f16714d-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.039631 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt4lr\" (UniqueName: \"kubernetes.io/projected/700a6806-8beb-4f1b-8462-dd888f16714d-kube-api-access-dt4lr\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.120677 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.122548 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.124037 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.124258 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4jdxz"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.132058 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.202384 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc95p\" (UniqueName: \"kubernetes.io/projected/4b1216c9-a548-44fa-bf11-2e689bd7c575-kube-api-access-bc95p\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.202448 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-kolla-config\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.202478 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-config-data\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.260648 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-5112f5f6-1e38-40f8-8515-1530584890a6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-5112f5f6-1e38-40f8-8515-1530584890a6\") pod \"openstack-galera-0\" (UID: \"700a6806-8beb-4f1b-8462-dd888f16714d\") " pod="openstack/openstack-galera-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.304420 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc95p\" (UniqueName: \"kubernetes.io/projected/4b1216c9-a548-44fa-bf11-2e689bd7c575-kube-api-access-bc95p\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.304517 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-kolla-config\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.304563 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-config-data\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.305676 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-config-data\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.306616 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4b1216c9-a548-44fa-bf11-2e689bd7c575-kolla-config\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.322973 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc95p\" (UniqueName: \"kubernetes.io/projected/4b1216c9-a548-44fa-bf11-2e689bd7c575-kube-api-access-bc95p\") pod \"memcached-0\" (UID: \"4b1216c9-a548-44fa-bf11-2e689bd7c575\") " pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.401800 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.414694 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.420355 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e716ae39-1ad1-47c2-ac59-04b100421073","Type":"ContainerStarted","Data":"dfcfbaece45c1c1606c554eda4cd0568e204cf5bf9bd32d79621208593fb1e46"}
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.907654 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 14:45:28 crc kubenswrapper[4856]: W0320 14:45:28.909032 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod700a6806_8beb_4f1b_8462_dd888f16714d.slice/crio-9b24410fed43e4cb3317cd1a1d5aa9e7ef4d125cf36c87571c374124557bfc80 WatchSource:0}: Error finding container 9b24410fed43e4cb3317cd1a1d5aa9e7ef4d125cf36c87571c374124557bfc80: Status 404 returned error can't find the container with id 9b24410fed43e4cb3317cd1a1d5aa9e7ef4d125cf36c87571c374124557bfc80
Mar 20 14:45:28 crc kubenswrapper[4856]: I0320 14:45:28.948003 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 14:45:28 crc kubenswrapper[4856]: W0320 14:45:28.954699 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b1216c9_a548_44fa_bf11_2e689bd7c575.slice/crio-e5618983acb6ab768613c46ba5016f2352d33a4c0d3843cdaf2b3369d95ad3ec WatchSource:0}: Error finding container e5618983acb6ab768613c46ba5016f2352d33a4c0d3843cdaf2b3369d95ad3ec: Status 404 returned error can't find the container with id e5618983acb6ab768613c46ba5016f2352d33a4c0d3843cdaf2b3369d95ad3ec
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.168162 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.170431 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.172714 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cqts6"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.172796 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.173172 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.175567 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.180484 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234225 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234299 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234344 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xw8z\" (UniqueName: \"kubernetes.io/projected/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kube-api-access-6xw8z\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234376 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234396 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234598 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234659 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.234703 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.336574 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xw8z\" (UniqueName: \"kubernetes.io/projected/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kube-api-access-6xw8z\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.336664 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.336688 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337511 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337548 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337572 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337617 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337764 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337789 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.337818 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.338745 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.339386 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.341064 4856 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.341107 4856 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9aa51fa1939dbda7097732641efa89212aea448011641096febbb5df86b19b0d/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.344905 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.350043 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.353818 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xw8z\" (UniqueName: \"kubernetes.io/projected/a4de8ddb-91b7-44f4-a98f-aba9d1f2527d-kube-api-access-6xw8z\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.368688 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-15d5ebb6-4030-407f-9bef-c7f20037603f\") pod \"openstack-cell1-galera-0\" (UID: \"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d\") " pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.430898 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" event={"ID":"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f","Type":"ContainerStarted","Data":"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.431207 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.432752 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"700a6806-8beb-4f1b-8462-dd888f16714d","Type":"ContainerStarted","Data":"789b398872e5c1de51b98c7877571f56a17f194a7c2aa4655cb00682223fd368"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.432788 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"700a6806-8beb-4f1b-8462-dd888f16714d","Type":"ContainerStarted","Data":"9b24410fed43e4cb3317cd1a1d5aa9e7ef4d125cf36c87571c374124557bfc80"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.435161 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" event={"ID":"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b","Type":"ContainerStarted","Data":"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.435564 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.437524 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"4b1216c9-a548-44fa-bf11-2e689bd7c575","Type":"ContainerStarted","Data":"c1b3e3dbf569219a32b68230aab944ebd4318a09425b35d5ef92433246eaf19f"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.437549 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"4b1216c9-a548-44fa-bf11-2e689bd7c575","Type":"ContainerStarted","Data":"e5618983acb6ab768613c46ba5016f2352d33a4c0d3843cdaf2b3369d95ad3ec"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.437904 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.439348 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab","Type":"ContainerStarted","Data":"6f5cffcf986df6c27ffbef70a2266993b5ae36d12582435a471dcc722bf4b7b8"} Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.447360 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" podStartSLOduration=4.447325125 podStartE2EDuration="4.447325125s" podCreationTimestamp="2026-03-20 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:29.44637535 +0000 UTC m=+4944.327401540" watchObservedRunningTime="2026-03-20 14:45:29.447325125 +0000 UTC m=+4944.328351275" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.468528 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" podStartSLOduration=4.468509904 podStartE2EDuration="4.468509904s" podCreationTimestamp="2026-03-20 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:29.462260483 +0000 UTC m=+4944.343286633" 
watchObservedRunningTime="2026-03-20 14:45:29.468509904 +0000 UTC m=+4944.349536034" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.493942 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.506982 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=1.5069636320000002 podStartE2EDuration="1.506963632s" podCreationTimestamp="2026-03-20 14:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:29.503708393 +0000 UTC m=+4944.384734543" watchObservedRunningTime="2026-03-20 14:45:29.506963632 +0000 UTC m=+4944.387989782" Mar 20 14:45:29 crc kubenswrapper[4856]: I0320 14:45:29.915929 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 14:45:29 crc kubenswrapper[4856]: W0320 14:45:29.946151 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4de8ddb_91b7_44f4_a98f_aba9d1f2527d.slice/crio-278e5df58f0c3c817f8a76571be146ac6eb3c09b5f6257315cf24d3ea297e4a0 WatchSource:0}: Error finding container 278e5df58f0c3c817f8a76571be146ac6eb3c09b5f6257315cf24d3ea297e4a0: Status 404 returned error can't find the container with id 278e5df58f0c3c817f8a76571be146ac6eb3c09b5f6257315cf24d3ea297e4a0 Mar 20 14:45:30 crc kubenswrapper[4856]: I0320 14:45:30.447085 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d","Type":"ContainerStarted","Data":"ef510d92f6431b04d5d8a2572976a6710da6afd9108b2b1077b890ae0000f05d"} Mar 20 14:45:30 crc kubenswrapper[4856]: I0320 14:45:30.447509 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d","Type":"ContainerStarted","Data":"278e5df58f0c3c817f8a76571be146ac6eb3c09b5f6257315cf24d3ea297e4a0"} Mar 20 14:45:32 crc kubenswrapper[4856]: I0320 14:45:32.461725 4856 generic.go:334] "Generic (PLEG): container finished" podID="700a6806-8beb-4f1b-8462-dd888f16714d" containerID="789b398872e5c1de51b98c7877571f56a17f194a7c2aa4655cb00682223fd368" exitCode=0 Mar 20 14:45:32 crc kubenswrapper[4856]: I0320 14:45:32.461761 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"700a6806-8beb-4f1b-8462-dd888f16714d","Type":"ContainerDied","Data":"789b398872e5c1de51b98c7877571f56a17f194a7c2aa4655cb00682223fd368"} Mar 20 14:45:33 crc kubenswrapper[4856]: I0320 14:45:33.471235 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"700a6806-8beb-4f1b-8462-dd888f16714d","Type":"ContainerStarted","Data":"561af2c7408489ee16c735e55cb531fefa972717aad7cbf768b5e8e786dd478e"} Mar 20 14:45:33 crc kubenswrapper[4856]: I0320 14:45:33.495394 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.495372292 podStartE2EDuration="7.495372292s" podCreationTimestamp="2026-03-20 14:45:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:33.489341937 +0000 UTC m=+4948.370368117" watchObservedRunningTime="2026-03-20 14:45:33.495372292 +0000 UTC m=+4948.376398422" Mar 20 14:45:34 crc kubenswrapper[4856]: I0320 14:45:34.483196 4856 generic.go:334] "Generic (PLEG): container finished" podID="a4de8ddb-91b7-44f4-a98f-aba9d1f2527d" containerID="ef510d92f6431b04d5d8a2572976a6710da6afd9108b2b1077b890ae0000f05d" exitCode=0 Mar 20 14:45:34 crc kubenswrapper[4856]: I0320 14:45:34.483259 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d","Type":"ContainerDied","Data":"ef510d92f6431b04d5d8a2572976a6710da6afd9108b2b1077b890ae0000f05d"} Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.421616 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.497567 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"a4de8ddb-91b7-44f4-a98f-aba9d1f2527d","Type":"ContainerStarted","Data":"2d49b4b8c4f26b5362c2f015371014c6fc8573338b9464fb9e215c0a2b922590"} Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.526521 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.526499562 podStartE2EDuration="7.526499562s" podCreationTimestamp="2026-03-20 14:45:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:35.520931871 +0000 UTC m=+4950.401958031" watchObservedRunningTime="2026-03-20 14:45:35.526499562 +0000 UTC m=+4950.407525692" Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.824593 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:45:35 crc kubenswrapper[4856]: E0320 14:45:35.824802 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.829231 4856 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.907168 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.907417 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="dnsmasq-dns" containerID="cri-o://54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8" gracePeriod=10 Mar 20 14:45:35 crc kubenswrapper[4856]: I0320 14:45:35.926906 4856 scope.go:117] "RemoveContainer" containerID="b16de28596509ca45de50ad52974e3e732329e5400c9ea35da97ba81b59777b9" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.329157 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.449505 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config\") pod \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.449625 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxdkg\" (UniqueName: \"kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg\") pod \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\" (UID: \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.449667 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc\") pod \"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\" (UID: 
\"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f\") " Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.456496 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg" (OuterVolumeSpecName: "kube-api-access-qxdkg") pod "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" (UID: "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f"). InnerVolumeSpecName "kube-api-access-qxdkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.478918 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" (UID: "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.483477 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config" (OuterVolumeSpecName: "config") pod "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" (UID: "d07d285c-35c6-4e5b-aa2b-7f88a4685b7f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.507049 4856 generic.go:334] "Generic (PLEG): container finished" podID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerID="54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8" exitCode=0 Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.507106 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" event={"ID":"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f","Type":"ContainerDied","Data":"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8"} Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.507138 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.507158 4856 scope.go:117] "RemoveContainer" containerID="54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.507145 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-wf9gp" event={"ID":"d07d285c-35c6-4e5b-aa2b-7f88a4685b7f","Type":"ContainerDied","Data":"792cef9aa70a70ff8b71c4b328aff4606c1f44dc2a64e619ccc22e61239615e4"} Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.522452 4856 scope.go:117] "RemoveContainer" containerID="51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.541962 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.544418 4856 scope.go:117] "RemoveContainer" containerID="54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8" Mar 20 14:45:36 crc kubenswrapper[4856]: E0320 14:45:36.544750 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8\": container with ID starting with 54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8 not found: ID does not exist" containerID="54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.544798 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8"} err="failed to get container status \"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8\": rpc error: code = NotFound desc = could not find container \"54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8\": container with ID starting with 54be00cfc30965cdb1acbfaaa2e79d1391bb6db346194d275494cf22bc09c4b8 not found: ID does not exist" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.544829 4856 scope.go:117] "RemoveContainer" containerID="51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4" Mar 20 14:45:36 crc kubenswrapper[4856]: E0320 14:45:36.545228 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4\": container with ID starting with 51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4 not found: ID does not exist" containerID="51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.545289 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4"} err="failed to get container status \"51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4\": rpc error: code = NotFound desc = could not find container \"51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4\": 
container with ID starting with 51d2453a6d91235fa47d7384ed32b2dd52e592742d28ac47143319bd40de1eb4 not found: ID does not exist" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.548660 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-wf9gp"] Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.551073 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxdkg\" (UniqueName: \"kubernetes.io/projected/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-kube-api-access-qxdkg\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.551101 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:36 crc kubenswrapper[4856]: I0320 14:45:36.551112 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:37 crc kubenswrapper[4856]: I0320 14:45:37.828397 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" path="/var/lib/kubelet/pods/d07d285c-35c6-4e5b-aa2b-7f88a4685b7f/volumes" Mar 20 14:45:38 crc kubenswrapper[4856]: I0320 14:45:38.402517 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 14:45:38 crc kubenswrapper[4856]: I0320 14:45:38.402912 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 14:45:38 crc kubenswrapper[4856]: I0320 14:45:38.416431 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 14:45:39 crc kubenswrapper[4856]: I0320 14:45:39.495002 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:39 crc kubenswrapper[4856]: I0320 14:45:39.495069 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:39 crc kubenswrapper[4856]: I0320 14:45:39.574537 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:39 crc kubenswrapper[4856]: I0320 14:45:39.666686 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 14:45:40 crc kubenswrapper[4856]: I0320 14:45:40.705075 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 14:45:40 crc kubenswrapper[4856]: I0320 14:45:40.786410 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.609052 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-z777c"] Mar 20 14:45:46 crc kubenswrapper[4856]: E0320 14:45:46.609846 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="init" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.609867 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="init" Mar 20 14:45:46 crc kubenswrapper[4856]: E0320 14:45:46.609892 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="dnsmasq-dns" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.609903 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="dnsmasq-dns" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.610142 4856 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d07d285c-35c6-4e5b-aa2b-7f88a4685b7f" containerName="dnsmasq-dns" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.610935 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.614415 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.624988 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z777c"] Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.705347 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mdlp\" (UniqueName: \"kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.705437 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.806941 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.807452 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-6mdlp\" (UniqueName: \"kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.808316 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.832115 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mdlp\" (UniqueName: \"kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp\") pod \"root-account-create-update-z777c\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " pod="openstack/root-account-create-update-z777c" Mar 20 14:45:46 crc kubenswrapper[4856]: I0320 14:45:46.943087 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z777c" Mar 20 14:45:47 crc kubenswrapper[4856]: I0320 14:45:47.357199 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-z777c"] Mar 20 14:45:47 crc kubenswrapper[4856]: W0320 14:45:47.360634 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1f4624a_3660_417e_b07f_dac09f9bd711.slice/crio-7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905 WatchSource:0}: Error finding container 7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905: Status 404 returned error can't find the container with id 7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905 Mar 20 14:45:47 crc kubenswrapper[4856]: I0320 14:45:47.603367 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z777c" event={"ID":"c1f4624a-3660-417e-b07f-dac09f9bd711","Type":"ContainerStarted","Data":"851a84dfe498dc85cf933494515139ec3053841416f8b8115be21e7c4e64d870"} Mar 20 14:45:47 crc kubenswrapper[4856]: I0320 14:45:47.603418 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z777c" event={"ID":"c1f4624a-3660-417e-b07f-dac09f9bd711","Type":"ContainerStarted","Data":"7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905"} Mar 20 14:45:47 crc kubenswrapper[4856]: I0320 14:45:47.625376 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-z777c" podStartSLOduration=1.6253554609999998 podStartE2EDuration="1.625355461s" podCreationTimestamp="2026-03-20 14:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:45:47.619175613 +0000 UTC m=+4962.500201743" watchObservedRunningTime="2026-03-20 14:45:47.625355461 +0000 UTC 
m=+4962.506381601" Mar 20 14:45:47 crc kubenswrapper[4856]: I0320 14:45:47.820743 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:45:48 crc kubenswrapper[4856]: I0320 14:45:48.614141 4856 generic.go:334] "Generic (PLEG): container finished" podID="c1f4624a-3660-417e-b07f-dac09f9bd711" containerID="851a84dfe498dc85cf933494515139ec3053841416f8b8115be21e7c4e64d870" exitCode=0 Mar 20 14:45:48 crc kubenswrapper[4856]: I0320 14:45:48.614192 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z777c" event={"ID":"c1f4624a-3660-417e-b07f-dac09f9bd711","Type":"ContainerDied","Data":"851a84dfe498dc85cf933494515139ec3053841416f8b8115be21e7c4e64d870"} Mar 20 14:45:48 crc kubenswrapper[4856]: I0320 14:45:48.617661 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e"} Mar 20 14:45:49 crc kubenswrapper[4856]: I0320 14:45:49.917517 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z777c" Mar 20 14:45:49 crc kubenswrapper[4856]: I0320 14:45:49.950322 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mdlp\" (UniqueName: \"kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp\") pod \"c1f4624a-3660-417e-b07f-dac09f9bd711\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " Mar 20 14:45:49 crc kubenswrapper[4856]: I0320 14:45:49.950646 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts\") pod \"c1f4624a-3660-417e-b07f-dac09f9bd711\" (UID: \"c1f4624a-3660-417e-b07f-dac09f9bd711\") " Mar 20 14:45:49 crc kubenswrapper[4856]: I0320 14:45:49.951284 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1f4624a-3660-417e-b07f-dac09f9bd711" (UID: "c1f4624a-3660-417e-b07f-dac09f9bd711"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:45:49 crc kubenswrapper[4856]: I0320 14:45:49.956468 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp" (OuterVolumeSpecName: "kube-api-access-6mdlp") pod "c1f4624a-3660-417e-b07f-dac09f9bd711" (UID: "c1f4624a-3660-417e-b07f-dac09f9bd711"). InnerVolumeSpecName "kube-api-access-6mdlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:45:50 crc kubenswrapper[4856]: I0320 14:45:50.062216 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1f4624a-3660-417e-b07f-dac09f9bd711-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:50 crc kubenswrapper[4856]: I0320 14:45:50.062259 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mdlp\" (UniqueName: \"kubernetes.io/projected/c1f4624a-3660-417e-b07f-dac09f9bd711-kube-api-access-6mdlp\") on node \"crc\" DevicePath \"\"" Mar 20 14:45:50 crc kubenswrapper[4856]: I0320 14:45:50.637089 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-z777c" event={"ID":"c1f4624a-3660-417e-b07f-dac09f9bd711","Type":"ContainerDied","Data":"7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905"} Mar 20 14:45:50 crc kubenswrapper[4856]: I0320 14:45:50.637139 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bee97f0d852cc1eea43322d091afa94527cf940671af058153faaea02159905" Mar 20 14:45:50 crc kubenswrapper[4856]: I0320 14:45:50.637701 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-z777c" Mar 20 14:45:53 crc kubenswrapper[4856]: I0320 14:45:53.138061 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-z777c"] Mar 20 14:45:53 crc kubenswrapper[4856]: I0320 14:45:53.143681 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-z777c"] Mar 20 14:45:53 crc kubenswrapper[4856]: I0320 14:45:53.832158 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f4624a-3660-417e-b07f-dac09f9bd711" path="/var/lib/kubelet/pods/c1f4624a-3660-417e-b07f-dac09f9bd711/volumes" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.160456 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-52c57"] Mar 20 14:45:58 crc kubenswrapper[4856]: E0320 14:45:58.161129 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f4624a-3660-417e-b07f-dac09f9bd711" containerName="mariadb-account-create-update" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.161149 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f4624a-3660-417e-b07f-dac09f9bd711" containerName="mariadb-account-create-update" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.161387 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f4624a-3660-417e-b07f-dac09f9bd711" containerName="mariadb-account-create-update" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.162485 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.165794 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.174812 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-52c57"] Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.295121 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts\") pod \"root-account-create-update-52c57\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.295296 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6jbz\" (UniqueName: \"kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz\") pod \"root-account-create-update-52c57\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.398238 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts\") pod \"root-account-create-update-52c57\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.398321 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6jbz\" (UniqueName: \"kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz\") pod \"root-account-create-update-52c57\" (UID: 
\"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.399060 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts\") pod \"root-account-create-update-52c57\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.423664 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6jbz\" (UniqueName: \"kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz\") pod \"root-account-create-update-52c57\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.493435 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-52c57" Mar 20 14:45:58 crc kubenswrapper[4856]: I0320 14:45:58.904861 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-52c57"] Mar 20 14:45:59 crc kubenswrapper[4856]: I0320 14:45:59.705152 4856 generic.go:334] "Generic (PLEG): container finished" podID="284c63cf-8405-466d-88c5-97ededc321bc" containerID="ed7e3d7112437fba065d44f6016fef675df6f9ebd8228f88f8ebb5e8ad76e7c5" exitCode=0 Mar 20 14:45:59 crc kubenswrapper[4856]: I0320 14:45:59.705201 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-52c57" event={"ID":"284c63cf-8405-466d-88c5-97ededc321bc","Type":"ContainerDied","Data":"ed7e3d7112437fba065d44f6016fef675df6f9ebd8228f88f8ebb5e8ad76e7c5"} Mar 20 14:45:59 crc kubenswrapper[4856]: I0320 14:45:59.705656 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-52c57" event={"ID":"284c63cf-8405-466d-88c5-97ededc321bc","Type":"ContainerStarted","Data":"1704fee38c778a140c17eae766a2374b0013ca2fbd4e4ff90a5f371428432d9f"} Mar 20 14:45:59 crc kubenswrapper[4856]: I0320 14:45:59.708722 4856 generic.go:334] "Generic (PLEG): container finished" podID="e716ae39-1ad1-47c2-ac59-04b100421073" containerID="dfcfbaece45c1c1606c554eda4cd0568e204cf5bf9bd32d79621208593fb1e46" exitCode=0 Mar 20 14:45:59 crc kubenswrapper[4856]: I0320 14:45:59.708787 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e716ae39-1ad1-47c2-ac59-04b100421073","Type":"ContainerDied","Data":"dfcfbaece45c1c1606c554eda4cd0568e204cf5bf9bd32d79621208593fb1e46"} Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.130658 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566966-rt5td"] Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.131608 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.133857 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.134068 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.134197 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.143261 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-rt5td"] Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.223943 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz96n\" (UniqueName: \"kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n\") pod \"auto-csr-approver-29566966-rt5td\" (UID: \"c57cc40d-649a-4dff-8eeb-1483c7a72bf4\") " pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.326254 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz96n\" (UniqueName: \"kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n\") pod \"auto-csr-approver-29566966-rt5td\" (UID: \"c57cc40d-649a-4dff-8eeb-1483c7a72bf4\") " pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.352893 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz96n\" (UniqueName: \"kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n\") pod \"auto-csr-approver-29566966-rt5td\" (UID: \"c57cc40d-649a-4dff-8eeb-1483c7a72bf4\") " 
pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.455164 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.716623 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e716ae39-1ad1-47c2-ac59-04b100421073","Type":"ContainerStarted","Data":"124e4ca5b19c63f6a31f6330368d41635e9cbd72afbc5e2a6c978ac105d94511"} Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.717074 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.718843 4856 generic.go:334] "Generic (PLEG): container finished" podID="b16d74bd-8cbc-4a22-a06d-a5b7c15859ab" containerID="6f5cffcf986df6c27ffbef70a2266993b5ae36d12582435a471dcc722bf4b7b8" exitCode=0 Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.718945 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab","Type":"ContainerDied","Data":"6f5cffcf986df6c27ffbef70a2266993b5ae36d12582435a471dcc722bf4b7b8"} Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.767761 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.767739054 podStartE2EDuration="36.767739054s" podCreationTimestamp="2026-03-20 14:45:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:46:00.757617938 +0000 UTC m=+4975.638644078" watchObservedRunningTime="2026-03-20 14:46:00.767739054 +0000 UTC m=+4975.648765194" Mar 20 14:46:00 crc kubenswrapper[4856]: I0320 14:46:00.874470 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-infra/auto-csr-approver-29566966-rt5td"] Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.010670 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-52c57" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.140728 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6jbz\" (UniqueName: \"kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz\") pod \"284c63cf-8405-466d-88c5-97ededc321bc\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.140897 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts\") pod \"284c63cf-8405-466d-88c5-97ededc321bc\" (UID: \"284c63cf-8405-466d-88c5-97ededc321bc\") " Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.141718 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "284c63cf-8405-466d-88c5-97ededc321bc" (UID: "284c63cf-8405-466d-88c5-97ededc321bc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.146506 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz" (OuterVolumeSpecName: "kube-api-access-n6jbz") pod "284c63cf-8405-466d-88c5-97ededc321bc" (UID: "284c63cf-8405-466d-88c5-97ededc321bc"). InnerVolumeSpecName "kube-api-access-n6jbz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.242991 4856 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/284c63cf-8405-466d-88c5-97ededc321bc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.243033 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6jbz\" (UniqueName: \"kubernetes.io/projected/284c63cf-8405-466d-88c5-97ededc321bc-kube-api-access-n6jbz\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.731250 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-52c57" event={"ID":"284c63cf-8405-466d-88c5-97ededc321bc","Type":"ContainerDied","Data":"1704fee38c778a140c17eae766a2374b0013ca2fbd4e4ff90a5f371428432d9f"} Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.731299 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1704fee38c778a140c17eae766a2374b0013ca2fbd4e4ff90a5f371428432d9f" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.731346 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-52c57" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.740942 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-rt5td" event={"ID":"c57cc40d-649a-4dff-8eeb-1483c7a72bf4","Type":"ContainerStarted","Data":"ba49d94775f96efa9356a11338c59d2c51bcad6e8deddbb2dfd9786cbe926c99"} Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.743378 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b16d74bd-8cbc-4a22-a06d-a5b7c15859ab","Type":"ContainerStarted","Data":"5d5d9c6988d57d349065bf157521b79788d29c323a95019962d8f698da296a37"} Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.743995 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:46:01 crc kubenswrapper[4856]: I0320 14:46:01.771634 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.771610226 podStartE2EDuration="36.771610226s" podCreationTimestamp="2026-03-20 14:45:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:46:01.767150555 +0000 UTC m=+4976.648176705" watchObservedRunningTime="2026-03-20 14:46:01.771610226 +0000 UTC m=+4976.652636356" Mar 20 14:46:02 crc kubenswrapper[4856]: I0320 14:46:02.752193 4856 generic.go:334] "Generic (PLEG): container finished" podID="c57cc40d-649a-4dff-8eeb-1483c7a72bf4" containerID="ea5651f9695f7595779da1e181e5134bef63fe5eb571fdc510c6d9e42ba33052" exitCode=0 Mar 20 14:46:02 crc kubenswrapper[4856]: I0320 14:46:02.752248 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-rt5td" 
event={"ID":"c57cc40d-649a-4dff-8eeb-1483c7a72bf4","Type":"ContainerDied","Data":"ea5651f9695f7595779da1e181e5134bef63fe5eb571fdc510c6d9e42ba33052"} Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.088893 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.195110 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz96n\" (UniqueName: \"kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n\") pod \"c57cc40d-649a-4dff-8eeb-1483c7a72bf4\" (UID: \"c57cc40d-649a-4dff-8eeb-1483c7a72bf4\") " Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.203518 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n" (OuterVolumeSpecName: "kube-api-access-zz96n") pod "c57cc40d-649a-4dff-8eeb-1483c7a72bf4" (UID: "c57cc40d-649a-4dff-8eeb-1483c7a72bf4"). InnerVolumeSpecName "kube-api-access-zz96n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.296973 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz96n\" (UniqueName: \"kubernetes.io/projected/c57cc40d-649a-4dff-8eeb-1483c7a72bf4-kube-api-access-zz96n\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.767194 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-rt5td" event={"ID":"c57cc40d-649a-4dff-8eeb-1483c7a72bf4","Type":"ContainerDied","Data":"ba49d94775f96efa9356a11338c59d2c51bcad6e8deddbb2dfd9786cbe926c99"} Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.767237 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba49d94775f96efa9356a11338c59d2c51bcad6e8deddbb2dfd9786cbe926c99" Mar 20 14:46:04 crc kubenswrapper[4856]: I0320 14:46:04.767238 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-rt5td" Mar 20 14:46:05 crc kubenswrapper[4856]: I0320 14:46:05.153828 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-f7pbd"] Mar 20 14:46:05 crc kubenswrapper[4856]: I0320 14:46:05.159858 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-f7pbd"] Mar 20 14:46:05 crc kubenswrapper[4856]: I0320 14:46:05.828933 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1c9f117-237f-4b0a-9bfc-b6d3f6308a85" path="/var/lib/kubelet/pods/a1c9f117-237f-4b0a-9bfc-b6d3f6308a85/volumes" Mar 20 14:46:16 crc kubenswrapper[4856]: I0320 14:46:16.357494 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 14:46:16 crc kubenswrapper[4856]: I0320 14:46:16.750491 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.051422 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-l7lkj"] Mar 20 14:46:23 crc kubenswrapper[4856]: E0320 14:46:23.052313 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c57cc40d-649a-4dff-8eeb-1483c7a72bf4" containerName="oc" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.052327 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="c57cc40d-649a-4dff-8eeb-1483c7a72bf4" containerName="oc" Mar 20 14:46:23 crc kubenswrapper[4856]: E0320 14:46:23.052345 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="284c63cf-8405-466d-88c5-97ededc321bc" containerName="mariadb-account-create-update" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.052353 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="284c63cf-8405-466d-88c5-97ededc321bc" containerName="mariadb-account-create-update" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.052508 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="284c63cf-8405-466d-88c5-97ededc321bc" containerName="mariadb-account-create-update" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.052545 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="c57cc40d-649a-4dff-8eeb-1483c7a72bf4" containerName="oc" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.053353 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.070142 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-l7lkj"] Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.200682 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.200792 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-config\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.200839 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x69zc\" (UniqueName: \"kubernetes.io/projected/33844cc5-c637-4214-b684-af1d3a8228fc-kube-api-access-x69zc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.302733 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x69zc\" (UniqueName: \"kubernetes.io/projected/33844cc5-c637-4214-b684-af1d3a8228fc-kube-api-access-x69zc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.302898 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.302935 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-config\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.303692 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-config\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.303965 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33844cc5-c637-4214-b684-af1d3a8228fc-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.320930 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x69zc\" (UniqueName: \"kubernetes.io/projected/33844cc5-c637-4214-b684-af1d3a8228fc-kube-api-access-x69zc\") pod \"dnsmasq-dns-5b7946d7b9-l7lkj\" (UID: \"33844cc5-c637-4214-b684-af1d3a8228fc\") " pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.380463 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.835252 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-l7lkj"] Mar 20 14:46:23 crc kubenswrapper[4856]: I0320 14:46:23.929602 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" event={"ID":"33844cc5-c637-4214-b684-af1d3a8228fc","Type":"ContainerStarted","Data":"ab3ea3f3bad3ee3a90b16b6fd49a4f9133cd8da13f5adb1907461650d3ba298a"} Mar 20 14:46:24 crc kubenswrapper[4856]: E0320 14:46:24.211604 4856 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33844cc5_c637_4214_b684_af1d3a8228fc.slice/crio-conmon-62604a01bf839e3f50eb46a0530e33fafc357ff53a53aac1290ae632e2d0599f.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:46:24 crc kubenswrapper[4856]: I0320 14:46:24.938557 4856 generic.go:334] "Generic (PLEG): container finished" podID="33844cc5-c637-4214-b684-af1d3a8228fc" containerID="62604a01bf839e3f50eb46a0530e33fafc357ff53a53aac1290ae632e2d0599f" exitCode=0 Mar 20 14:46:24 crc kubenswrapper[4856]: I0320 14:46:24.938611 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" event={"ID":"33844cc5-c637-4214-b684-af1d3a8228fc","Type":"ContainerDied","Data":"62604a01bf839e3f50eb46a0530e33fafc357ff53a53aac1290ae632e2d0599f"} Mar 20 14:46:25 crc kubenswrapper[4856]: I0320 14:46:25.948375 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" event={"ID":"33844cc5-c637-4214-b684-af1d3a8228fc","Type":"ContainerStarted","Data":"32146a06ca55017e3204d45980a7658c89e4d4e7821e363695383d07475ac2f1"} Mar 20 14:46:25 crc kubenswrapper[4856]: I0320 14:46:25.948682 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:25 crc kubenswrapper[4856]: I0320 14:46:25.967157 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" podStartSLOduration=2.967137267 podStartE2EDuration="2.967137267s" podCreationTimestamp="2026-03-20 14:46:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:46:25.963700153 +0000 UTC m=+5000.844726303" watchObservedRunningTime="2026-03-20 14:46:25.967137267 +0000 UTC m=+5000.848163397" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.311201 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.314457 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.327496 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.417045 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.417183 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 
14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.417332 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.519360 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.519467 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.519515 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.520098 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: 
I0320 14:46:30.520131 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.544745 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw\") pod \"redhat-marketplace-lvw7z\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:30 crc kubenswrapper[4856]: I0320 14:46:30.632234 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:31 crc kubenswrapper[4856]: I0320 14:46:31.110386 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:31 crc kubenswrapper[4856]: I0320 14:46:31.990692 4856 generic.go:334] "Generic (PLEG): container finished" podID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerID="60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e" exitCode=0 Mar 20 14:46:31 crc kubenswrapper[4856]: I0320 14:46:31.990880 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerDied","Data":"60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e"} Mar 20 14:46:31 crc kubenswrapper[4856]: I0320 14:46:31.990989 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" 
event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerStarted","Data":"c5ca41522125036b54e4eda5d929875182f17645eeae0d201e7e1cdeddb7f509"} Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.001689 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerStarted","Data":"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b"} Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.382415 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-l7lkj" Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.434036 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.434298 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="dnsmasq-dns" containerID="cri-o://00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e" gracePeriod=10 Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.840762 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.972954 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc\") pod \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.973006 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config\") pod \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.973044 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cz2t\" (UniqueName: \"kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t\") pod \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\" (UID: \"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b\") " Mar 20 14:46:33 crc kubenswrapper[4856]: I0320 14:46:33.978309 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t" (OuterVolumeSpecName: "kube-api-access-7cz2t") pod "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" (UID: "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b"). InnerVolumeSpecName "kube-api-access-7cz2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.009533 4856 generic.go:334] "Generic (PLEG): container finished" podID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerID="e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b" exitCode=0 Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.009631 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerDied","Data":"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b"} Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.013454 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" (UID: "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.016821 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config" (OuterVolumeSpecName: "config") pod "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" (UID: "1718e84b-4ba3-4c92-b2bb-c9ab9987e62b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.018180 4856 generic.go:334] "Generic (PLEG): container finished" podID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerID="00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e" exitCode=0 Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.018230 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" event={"ID":"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b","Type":"ContainerDied","Data":"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e"} Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.018260 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" event={"ID":"1718e84b-4ba3-4c92-b2bb-c9ab9987e62b","Type":"ContainerDied","Data":"0ff592bb8011e0da49be12e7bde21e9753381de45a0f87312316880eb457fd88"} Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.018297 4856 scope.go:117] "RemoveContainer" containerID="00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.018496 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-p8np8" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.082737 4856 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.082798 4856 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-config\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.082872 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cz2t\" (UniqueName: \"kubernetes.io/projected/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b-kube-api-access-7cz2t\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.086078 4856 scope.go:117] "RemoveContainer" containerID="0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.105125 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.111097 4856 scope.go:117] "RemoveContainer" containerID="00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e" Mar 20 14:46:34 crc kubenswrapper[4856]: E0320 14:46:34.111539 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e\": container with ID starting with 00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e not found: ID does not exist" containerID="00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.111573 4856 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e"} err="failed to get container status \"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e\": rpc error: code = NotFound desc = could not find container \"00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e\": container with ID starting with 00353202eaba242cdbdbfd84b0f37c530d9bea08597be7a35a9aaa2389caa94e not found: ID does not exist" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.111598 4856 scope.go:117] "RemoveContainer" containerID="0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08" Mar 20 14:46:34 crc kubenswrapper[4856]: E0320 14:46:34.111922 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08\": container with ID starting with 0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08 not found: ID does not exist" containerID="0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.111953 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08"} err="failed to get container status \"0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08\": rpc error: code = NotFound desc = could not find container \"0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08\": container with ID starting with 0eac410379f146d28de0bc67f6641b3da2a6390b82b86bc44b087a76e04e8c08 not found: ID does not exist" Mar 20 14:46:34 crc kubenswrapper[4856]: I0320 14:46:34.114717 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-p8np8"] Mar 20 14:46:35 crc kubenswrapper[4856]: I0320 14:46:35.028542 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerStarted","Data":"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899"} Mar 20 14:46:35 crc kubenswrapper[4856]: I0320 14:46:35.045635 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lvw7z" podStartSLOduration=2.374862051 podStartE2EDuration="5.045613072s" podCreationTimestamp="2026-03-20 14:46:30 +0000 UTC" firstStartedPulling="2026-03-20 14:46:31.992429713 +0000 UTC m=+5006.873455843" lastFinishedPulling="2026-03-20 14:46:34.663180734 +0000 UTC m=+5009.544206864" observedRunningTime="2026-03-20 14:46:35.043997338 +0000 UTC m=+5009.925023468" watchObservedRunningTime="2026-03-20 14:46:35.045613072 +0000 UTC m=+5009.926639202" Mar 20 14:46:35 crc kubenswrapper[4856]: I0320 14:46:35.828704 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" path="/var/lib/kubelet/pods/1718e84b-4ba3-4c92-b2bb-c9ab9987e62b/volumes" Mar 20 14:46:36 crc kubenswrapper[4856]: I0320 14:46:36.045589 4856 scope.go:117] "RemoveContainer" containerID="d2536b3ae04957af475866a559af964e55d63f8b98feb233d0b298276dd4e8ab" Mar 20 14:46:40 crc kubenswrapper[4856]: I0320 14:46:40.632527 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:40 crc kubenswrapper[4856]: I0320 14:46:40.633129 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:40 crc kubenswrapper[4856]: I0320 14:46:40.676094 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:41 crc kubenswrapper[4856]: I0320 14:46:41.109111 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:42 crc kubenswrapper[4856]: I0320 14:46:42.082664 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.093327 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lvw7z" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="registry-server" containerID="cri-o://5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899" gracePeriod=2 Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.484801 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.624531 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities\") pod \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.624586 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content\") pod \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.624667 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw\") pod \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\" (UID: \"485b6b7f-2013-47da-8ecb-2236bc8f72ae\") " Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.625695 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities" (OuterVolumeSpecName: "utilities") pod "485b6b7f-2013-47da-8ecb-2236bc8f72ae" (UID: "485b6b7f-2013-47da-8ecb-2236bc8f72ae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.630849 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw" (OuterVolumeSpecName: "kube-api-access-5h8kw") pod "485b6b7f-2013-47da-8ecb-2236bc8f72ae" (UID: "485b6b7f-2013-47da-8ecb-2236bc8f72ae"). InnerVolumeSpecName "kube-api-access-5h8kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.726370 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h8kw\" (UniqueName: \"kubernetes.io/projected/485b6b7f-2013-47da-8ecb-2236bc8f72ae-kube-api-access-5h8kw\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.726402 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.747353 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "485b6b7f-2013-47da-8ecb-2236bc8f72ae" (UID: "485b6b7f-2013-47da-8ecb-2236bc8f72ae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:46:43 crc kubenswrapper[4856]: I0320 14:46:43.828140 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/485b6b7f-2013-47da-8ecb-2236bc8f72ae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.106305 4856 generic.go:334] "Generic (PLEG): container finished" podID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerID="5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899" exitCode=0 Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.106358 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerDied","Data":"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899"} Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.106389 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lvw7z" event={"ID":"485b6b7f-2013-47da-8ecb-2236bc8f72ae","Type":"ContainerDied","Data":"c5ca41522125036b54e4eda5d929875182f17645eeae0d201e7e1cdeddb7f509"} Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.106409 4856 scope.go:117] "RemoveContainer" containerID="5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.106474 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lvw7z" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.131110 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.134368 4856 scope.go:117] "RemoveContainer" containerID="e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.138697 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lvw7z"] Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.151989 4856 scope.go:117] "RemoveContainer" containerID="60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.179344 4856 scope.go:117] "RemoveContainer" containerID="5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899" Mar 20 14:46:44 crc kubenswrapper[4856]: E0320 14:46:44.179874 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899\": container with ID starting with 5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899 not found: ID does not exist" containerID="5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.179943 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899"} err="failed to get container status \"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899\": rpc error: code = NotFound desc = could not find container \"5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899\": container with ID starting with 5c0ce9b32b13ee4c8c35342b4bf8bf7053b28ba85d53fa90075e022dc7863899 not found: 
ID does not exist" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.179974 4856 scope.go:117] "RemoveContainer" containerID="e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b" Mar 20 14:46:44 crc kubenswrapper[4856]: E0320 14:46:44.180467 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b\": container with ID starting with e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b not found: ID does not exist" containerID="e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.180501 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b"} err="failed to get container status \"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b\": rpc error: code = NotFound desc = could not find container \"e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b\": container with ID starting with e9ea24699d41a19c17886b5a0e95b2f2def195bbb3d024d4f3d0121fa50cb51b not found: ID does not exist" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.180522 4856 scope.go:117] "RemoveContainer" containerID="60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e" Mar 20 14:46:44 crc kubenswrapper[4856]: E0320 14:46:44.180803 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e\": container with ID starting with 60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e not found: ID does not exist" containerID="60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e" Mar 20 14:46:44 crc kubenswrapper[4856]: I0320 14:46:44.180920 4856 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e"} err="failed to get container status \"60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e\": rpc error: code = NotFound desc = could not find container \"60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e\": container with ID starting with 60a244500c9970b330799f0fbb1598d9464f22a7a4f8050d1a684ca53dc1095e not found: ID does not exist" Mar 20 14:46:45 crc kubenswrapper[4856]: I0320 14:46:45.828874 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" path="/var/lib/kubelet/pods/485b6b7f-2013-47da-8ecb-2236bc8f72ae/volumes" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.149231 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fjr6f/must-gather-hkhnb"] Mar 20 14:47:12 crc kubenswrapper[4856]: E0320 14:47:12.150179 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="extract-content" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150198 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="extract-content" Mar 20 14:47:12 crc kubenswrapper[4856]: E0320 14:47:12.150214 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="registry-server" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150221 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="registry-server" Mar 20 14:47:12 crc kubenswrapper[4856]: E0320 14:47:12.150242 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="extract-utilities" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150250 4856 
state_mem.go:107] "Deleted CPUSet assignment" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="extract-utilities" Mar 20 14:47:12 crc kubenswrapper[4856]: E0320 14:47:12.150268 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="init" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150276 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="init" Mar 20 14:47:12 crc kubenswrapper[4856]: E0320 14:47:12.150312 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="dnsmasq-dns" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150322 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="dnsmasq-dns" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150545 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="485b6b7f-2013-47da-8ecb-2236bc8f72ae" containerName="registry-server" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.150568 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="1718e84b-4ba3-4c92-b2bb-c9ab9987e62b" containerName="dnsmasq-dns" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.151338 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.153264 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fjr6f"/"openshift-service-ca.crt" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.153482 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fjr6f"/"kube-root-ca.crt" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.153514 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fjr6f"/"default-dockercfg-gs66v" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.156874 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fjr6f/must-gather-hkhnb"] Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.278758 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9s5p\" (UniqueName: \"kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.279210 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.380956 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " 
pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.381005 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9s5p\" (UniqueName: \"kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.381506 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.402072 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9s5p\" (UniqueName: \"kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p\") pod \"must-gather-hkhnb\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.470906 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.902738 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fjr6f/must-gather-hkhnb"] Mar 20 14:47:12 crc kubenswrapper[4856]: I0320 14:47:12.904264 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:47:13 crc kubenswrapper[4856]: I0320 14:47:13.340809 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" event={"ID":"317a9dab-b70f-45db-afd0-51fc3430145f","Type":"ContainerStarted","Data":"0ffb11c017af65e79ad8974aa07754453b1e68b50c6603aaf865de62f88db03c"} Mar 20 14:47:17 crc kubenswrapper[4856]: I0320 14:47:17.373484 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" event={"ID":"317a9dab-b70f-45db-afd0-51fc3430145f","Type":"ContainerStarted","Data":"d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4"} Mar 20 14:47:17 crc kubenswrapper[4856]: I0320 14:47:17.374620 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" event={"ID":"317a9dab-b70f-45db-afd0-51fc3430145f","Type":"ContainerStarted","Data":"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14"} Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.141912 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" podStartSLOduration=44.598311844 podStartE2EDuration="48.141887915s" podCreationTimestamp="2026-03-20 14:47:12 +0000 UTC" firstStartedPulling="2026-03-20 14:47:12.904016106 +0000 UTC m=+5047.785042246" lastFinishedPulling="2026-03-20 14:47:16.447592187 +0000 UTC m=+5051.328618317" observedRunningTime="2026-03-20 14:47:17.395876689 +0000 UTC m=+5052.276902839" watchObservedRunningTime="2026-03-20 14:48:00.141887915 +0000 UTC 
m=+5095.022914045" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.147457 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566968-rllwc"] Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.148847 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.151227 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.151244 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.151509 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.158927 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-rllwc"] Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.292457 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnncn\" (UniqueName: \"kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn\") pod \"auto-csr-approver-29566968-rllwc\" (UID: \"019397bf-1f67-46eb-8176-b47b237770fe\") " pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.393994 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnncn\" (UniqueName: \"kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn\") pod \"auto-csr-approver-29566968-rllwc\" (UID: \"019397bf-1f67-46eb-8176-b47b237770fe\") " pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 
14:48:00.412131 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnncn\" (UniqueName: \"kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn\") pod \"auto-csr-approver-29566968-rllwc\" (UID: \"019397bf-1f67-46eb-8176-b47b237770fe\") " pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.478857 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:00 crc kubenswrapper[4856]: I0320 14:48:00.942081 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-rllwc"] Mar 20 14:48:01 crc kubenswrapper[4856]: I0320 14:48:01.680799 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-rllwc" event={"ID":"019397bf-1f67-46eb-8176-b47b237770fe","Type":"ContainerStarted","Data":"5cae6fbfb5c6102a324799c4419acf1a6c751871b7eb495475a0fdbc7eeb4a30"} Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.010519 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b7946d7b9-l7lkj_33844cc5-c637-4214-b684-af1d3a8228fc/init/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.195988 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b7946d7b9-l7lkj_33844cc5-c637-4214-b684-af1d3a8228fc/init/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.247355 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5b7946d7b9-l7lkj_33844cc5-c637-4214-b684-af1d3a8228fc/dnsmasq-dns/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.399822 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_4b1216c9-a548-44fa-bf11-2e689bd7c575/memcached/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.541974 4856 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4de8ddb-91b7-44f4-a98f-aba9d1f2527d/mysql-bootstrap/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.756978 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4de8ddb-91b7-44f4-a98f-aba9d1f2527d/mysql-bootstrap/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.765875 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_a4de8ddb-91b7-44f4-a98f-aba9d1f2527d/galera/0.log" Mar 20 14:48:02 crc kubenswrapper[4856]: I0320 14:48:02.766758 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_700a6806-8beb-4f1b-8462-dd888f16714d/mysql-bootstrap/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.022225 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_700a6806-8beb-4f1b-8462-dd888f16714d/galera/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.094638 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_700a6806-8beb-4f1b-8462-dd888f16714d/mysql-bootstrap/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.167921 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b16d74bd-8cbc-4a22-a06d-a5b7c15859ab/setup-container/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.238721 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b16d74bd-8cbc-4a22-a06d-a5b7c15859ab/setup-container/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.291037 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b16d74bd-8cbc-4a22-a06d-a5b7c15859ab/rabbitmq/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.389211 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_e716ae39-1ad1-47c2-ac59-04b100421073/setup-container/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.536766 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e716ae39-1ad1-47c2-ac59-04b100421073/rabbitmq/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.540417 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_e716ae39-1ad1-47c2-ac59-04b100421073/setup-container/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.616136 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-52c57_284c63cf-8405-466d-88c5-97ededc321bc/mariadb-account-create-update/0.log" Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.697754 4856 generic.go:334] "Generic (PLEG): container finished" podID="019397bf-1f67-46eb-8176-b47b237770fe" containerID="4774dd37ee4ef80f2a3b790a74d2e2ee5dede93ebb18e31683129a42ebc53b52" exitCode=0 Mar 20 14:48:03 crc kubenswrapper[4856]: I0320 14:48:03.697795 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-rllwc" event={"ID":"019397bf-1f67-46eb-8176-b47b237770fe","Type":"ContainerDied","Data":"4774dd37ee4ef80f2a3b790a74d2e2ee5dede93ebb18e31683129a42ebc53b52"} Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.023372 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.162589 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnncn\" (UniqueName: \"kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn\") pod \"019397bf-1f67-46eb-8176-b47b237770fe\" (UID: \"019397bf-1f67-46eb-8176-b47b237770fe\") " Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.167207 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn" (OuterVolumeSpecName: "kube-api-access-jnncn") pod "019397bf-1f67-46eb-8176-b47b237770fe" (UID: "019397bf-1f67-46eb-8176-b47b237770fe"). InnerVolumeSpecName "kube-api-access-jnncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.264347 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnncn\" (UniqueName: \"kubernetes.io/projected/019397bf-1f67-46eb-8176-b47b237770fe-kube-api-access-jnncn\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.714652 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-rllwc" event={"ID":"019397bf-1f67-46eb-8176-b47b237770fe","Type":"ContainerDied","Data":"5cae6fbfb5c6102a324799c4419acf1a6c751871b7eb495475a0fdbc7eeb4a30"} Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.714700 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-rllwc" Mar 20 14:48:05 crc kubenswrapper[4856]: I0320 14:48:05.714707 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cae6fbfb5c6102a324799c4419acf1a6c751871b7eb495475a0fdbc7eeb4a30" Mar 20 14:48:06 crc kubenswrapper[4856]: I0320 14:48:06.083471 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-l7pxc"] Mar 20 14:48:06 crc kubenswrapper[4856]: I0320 14:48:06.089221 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-l7pxc"] Mar 20 14:48:07 crc kubenswrapper[4856]: I0320 14:48:07.827427 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a681a7b2-6f2b-4a71-b035-5fb78e206899" path="/var/lib/kubelet/pods/a681a7b2-6f2b-4a71-b035-5fb78e206899/volumes" Mar 20 14:48:09 crc kubenswrapper[4856]: I0320 14:48:09.987755 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:48:09 crc kubenswrapper[4856]: I0320 14:48:09.988071 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.892519 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:11 crc kubenswrapper[4856]: E0320 14:48:11.893233 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="019397bf-1f67-46eb-8176-b47b237770fe" 
containerName="oc" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.893253 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="019397bf-1f67-46eb-8176-b47b237770fe" containerName="oc" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.893482 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="019397bf-1f67-46eb-8176-b47b237770fe" containerName="oc" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.894773 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.915834 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.964235 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.964314 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:11 crc kubenswrapper[4856]: I0320 14:48:11.964446 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jk2z\" (UniqueName: \"kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " 
pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.066958 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.067112 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.067229 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jk2z\" (UniqueName: \"kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.067683 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.067732 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " 
pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.093219 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jk2z\" (UniqueName: \"kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z\") pod \"community-operators-v8vpz\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.215318 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.728602 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:12 crc kubenswrapper[4856]: I0320 14:48:12.778470 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerStarted","Data":"c9d0993b6e79ea07e85e6cee5cce74525937a60aaa36cc7e9bc5a8aa8d51a128"} Mar 20 14:48:13 crc kubenswrapper[4856]: I0320 14:48:13.790196 4856 generic.go:334] "Generic (PLEG): container finished" podID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerID="fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5" exitCode=0 Mar 20 14:48:13 crc kubenswrapper[4856]: I0320 14:48:13.790233 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerDied","Data":"fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5"} Mar 20 14:48:14 crc kubenswrapper[4856]: I0320 14:48:14.799104 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" 
event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerStarted","Data":"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4"} Mar 20 14:48:15 crc kubenswrapper[4856]: I0320 14:48:15.807378 4856 generic.go:334] "Generic (PLEG): container finished" podID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerID="b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4" exitCode=0 Mar 20 14:48:15 crc kubenswrapper[4856]: I0320 14:48:15.807425 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerDied","Data":"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4"} Mar 20 14:48:16 crc kubenswrapper[4856]: I0320 14:48:16.816793 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerStarted","Data":"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3"} Mar 20 14:48:16 crc kubenswrapper[4856]: I0320 14:48:16.840651 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-v8vpz" podStartSLOduration=3.224183867 podStartE2EDuration="5.840627275s" podCreationTimestamp="2026-03-20 14:48:11 +0000 UTC" firstStartedPulling="2026-03-20 14:48:13.791870577 +0000 UTC m=+5108.672896717" lastFinishedPulling="2026-03-20 14:48:16.408313955 +0000 UTC m=+5111.289340125" observedRunningTime="2026-03-20 14:48:16.837084818 +0000 UTC m=+5111.718110948" watchObservedRunningTime="2026-03-20 14:48:16.840627275 +0000 UTC m=+5111.721653415" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.212979 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/util/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 
14:48:19.336401 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/util/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.373936 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/pull/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.406883 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/pull/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.567312 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/pull/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.609660 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/extract/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.614176 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fdce41130f5b29849949eff9ffde1b21c32ee084e9de87dcb7c2c7c8474bhr_cde5a69b-f5f6-401d-88c4-81ad127e860f/util/0.log" Mar 20 14:48:19 crc kubenswrapper[4856]: I0320 14:48:19.891147 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-h2vzr_c22916c3-cf42-4583-8529-2f42a5780500/manager/0.log" Mar 20 14:48:20 crc kubenswrapper[4856]: I0320 14:48:20.279960 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-4qkfk_602b2383-2c80-49b5-afa6-400c6022f0d6/manager/0.log" Mar 20 14:48:20 crc kubenswrapper[4856]: I0320 14:48:20.504664 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-7xj74_43236c8a-2018-4001-a8dc-67a9d4488f0a/manager/0.log" Mar 20 14:48:20 crc kubenswrapper[4856]: I0320 14:48:20.595620 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-7bv99_60481295-8929-4cff-88c0-fc9645c555e6/manager/0.log" Mar 20 14:48:20 crc kubenswrapper[4856]: I0320 14:48:20.769241 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-rf4db_803de023-bc1c-42f0-899b-b7053081db3b/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.084912 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-7zxs6_4ca1fc8c-012a-4067-8ca1-ae2424a66b65/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.267181 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-n45pl_fee7d83a-7c59-4a95-85b3-8f677f068731/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.430581 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-v698k_122f071b-3f1d-4364-8142-466caeb29677/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.443729 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-hwz76_16329126-6028-435b-b961-b483af84efc2/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.611600 4856 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-pv5l7_ef1eeee2-e51e-4771-934c-a4b0c9e4d949/manager/0.log" Mar 20 14:48:21 crc kubenswrapper[4856]: I0320 14:48:21.680007 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-ckd6w_cfd501fb-ec8a-4b56-840f-975ed1184cd3/manager/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.081496 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-95dk8_ef7a6885-15ab-47ac-911f-5ef35b971f7f/manager/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.213185 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-r6gkh_eb68c239-2237-493b-8943-597ad3822379/manager/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.215880 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.215945 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.287341 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.295839 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-58gr9_74388a48-2f88-4093-a143-628de32ad98c/manager/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.471650 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f5qfxfb_32083d25-90e1-4571-959b-629f6d8393a5/manager/0.log" Mar 20 
14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.622519 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-59b5998766-f2xcf_b64623b8-5be4-4269-891b-bb0154ab18b3/operator/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.907122 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.909944 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-95qtd_b2beaa10-d6ab-4f70-a62c-48d441fd3e8b/registry-server/0.log" Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.966032 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:22 crc kubenswrapper[4856]: I0320 14:48:22.986145 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-fr4gw_52b4ae24-4743-4bea-aac3-a6a2fd4b1990/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.129748 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-s2jkg_f62801cd-9d41-4312-a337-2e39d0bb1997/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.280773 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-xv6h2_b3055bde-69b7-478d-8ddf-bb187b58e23e/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.475000 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-2b2z6_625deb2f-8031-4fbd-93da-c6cfb29b5d9f/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.582893 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85d5885774-8ws6b_78560b1b-78fa-4282-a6c3-a06306ab470c/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.611120 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-nwp96_f6c2630c-bcb8-45a6-96ee-1cbe64b472ea/manager/0.log" Mar 20 14:48:23 crc kubenswrapper[4856]: I0320 14:48:23.722759 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-w6tnk_e75213b1-7eab-451f-bca1-4f38db805ba7/manager/0.log" Mar 20 14:48:24 crc kubenswrapper[4856]: I0320 14:48:24.872617 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v8vpz" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="registry-server" containerID="cri-o://7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3" gracePeriod=2 Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.661733 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.777971 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content\") pod \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.778109 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities\") pod \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.778182 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jk2z\" (UniqueName: \"kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z\") pod \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\" (UID: \"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa\") " Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.779934 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities" (OuterVolumeSpecName: "utilities") pod "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" (UID: "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.786729 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z" (OuterVolumeSpecName: "kube-api-access-2jk2z") pod "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" (UID: "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa"). InnerVolumeSpecName "kube-api-access-2jk2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.838851 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" (UID: "fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.880476 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.880511 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.880525 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jk2z\" (UniqueName: \"kubernetes.io/projected/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa-kube-api-access-2jk2z\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.889158 4856 generic.go:334] "Generic (PLEG): container finished" podID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerID="7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3" exitCode=0 Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.889230 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v8vpz" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.889252 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerDied","Data":"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3"} Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.889446 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v8vpz" event={"ID":"fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa","Type":"ContainerDied","Data":"c9d0993b6e79ea07e85e6cee5cce74525937a60aaa36cc7e9bc5a8aa8d51a128"} Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.889445 4856 scope.go:117] "RemoveContainer" containerID="7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.921653 4856 scope.go:117] "RemoveContainer" containerID="b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.929803 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.938346 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v8vpz"] Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.962419 4856 scope.go:117] "RemoveContainer" containerID="fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.995338 4856 scope.go:117] "RemoveContainer" containerID="7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3" Mar 20 14:48:26 crc kubenswrapper[4856]: E0320 14:48:26.995853 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3\": container with ID starting with 7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3 not found: ID does not exist" containerID="7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.995886 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3"} err="failed to get container status \"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3\": rpc error: code = NotFound desc = could not find container \"7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3\": container with ID starting with 7357011426f54fc0dd859c57499a22d730c1373501dcebbcc6af23b4c51b40f3 not found: ID does not exist" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.995911 4856 scope.go:117] "RemoveContainer" containerID="b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4" Mar 20 14:48:26 crc kubenswrapper[4856]: E0320 14:48:26.996214 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4\": container with ID starting with b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4 not found: ID does not exist" containerID="b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.996239 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4"} err="failed to get container status \"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4\": rpc error: code = NotFound desc = could not find container \"b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4\": container with ID 
starting with b2ed2f2fd343ff223d214c1ff0fe6e9448b42cced723e8fccc5fa7d6717239e4 not found: ID does not exist" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.996257 4856 scope.go:117] "RemoveContainer" containerID="fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5" Mar 20 14:48:26 crc kubenswrapper[4856]: E0320 14:48:26.996531 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5\": container with ID starting with fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5 not found: ID does not exist" containerID="fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5" Mar 20 14:48:26 crc kubenswrapper[4856]: I0320 14:48:26.996570 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5"} err="failed to get container status \"fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5\": rpc error: code = NotFound desc = could not find container \"fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5\": container with ID starting with fde72bf8575d1e477bdfac424d7a9d227a6f1c7f040fadf6388fa2a9ffd802f5 not found: ID does not exist" Mar 20 14:48:27 crc kubenswrapper[4856]: I0320 14:48:27.833679 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" path="/var/lib/kubelet/pods/fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa/volumes" Mar 20 14:48:36 crc kubenswrapper[4856]: I0320 14:48:36.161764 4856 scope.go:117] "RemoveContainer" containerID="5df7da3b0674ae17f8322d726c908dc9845e14e6ce2dbb7a9710f477d78967b2" Mar 20 14:48:36 crc kubenswrapper[4856]: I0320 14:48:36.183316 4856 scope.go:117] "RemoveContainer" containerID="7762a226700abce90a407e2c9dc2415e34aa87b269c663a8dec2e5d3a6500e18" Mar 20 14:48:39 crc kubenswrapper[4856]: 
I0320 14:48:39.987457 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:48:39 crc kubenswrapper[4856]: I0320 14:48:39.988090 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:48:42 crc kubenswrapper[4856]: I0320 14:48:42.019062 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-7dbpg_fa44c47f-e650-4056-9588-51fd98a96b99/control-plane-machine-set-operator/0.log" Mar 20 14:48:42 crc kubenswrapper[4856]: I0320 14:48:42.201222 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvkxl_36be461e-6bde-44a4-8cbe-35a5c8ef8be8/kube-rbac-proxy/0.log" Mar 20 14:48:42 crc kubenswrapper[4856]: I0320 14:48:42.225803 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cvkxl_36be461e-6bde-44a4-8cbe-35a5c8ef8be8/machine-api-operator/0.log" Mar 20 14:48:53 crc kubenswrapper[4856]: I0320 14:48:53.711502 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-7gbpp_4ba16d0c-f526-4e60-b685-e4d8b51766af/cert-manager-controller/0.log" Mar 20 14:48:53 crc kubenswrapper[4856]: I0320 14:48:53.868377 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-r7hgh_e187cb86-add1-4f10-b59f-cf21b0927b2b/cert-manager-cainjector/0.log" Mar 20 14:48:53 crc 
kubenswrapper[4856]: I0320 14:48:53.915054 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-bs4xl_775f75c4-f69a-4f8f-8946-ef4fb8909ea4/cert-manager-webhook/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.515204 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-bvbkn_f42479ac-4c93-4072-a6e1-3055d25b5dfd/nmstate-console-plugin/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.712758 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-kc6cb_59b5f4cf-9b54-47d0-b4e3-bb4f10fe2870/nmstate-handler/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.769747 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-vc7rs_dd1541ea-2421-4a7c-b572-c77955f3f748/kube-rbac-proxy/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.778748 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-vc7rs_dd1541ea-2421-4a7c-b572-c77955f3f748/nmstate-metrics/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.977012 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-s6jj2_69ed1b45-5fd1-4011-b0b2-11e0cdc73cc9/nmstate-operator/0.log" Mar 20 14:49:05 crc kubenswrapper[4856]: I0320 14:49:05.989861 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-2cppz_57f6e9d7-6cfd-47f8-99a5-7e2dc054bbd1/nmstate-webhook/0.log" Mar 20 14:49:09 crc kubenswrapper[4856]: I0320 14:49:09.988086 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 20 14:49:09 crc kubenswrapper[4856]: I0320 14:49:09.988743 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:49:09 crc kubenswrapper[4856]: I0320 14:49:09.988809 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:49:09 crc kubenswrapper[4856]: I0320 14:49:09.989618 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:49:09 crc kubenswrapper[4856]: I0320 14:49:09.989687 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e" gracePeriod=600 Mar 20 14:49:11 crc kubenswrapper[4856]: I0320 14:49:11.217014 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e" exitCode=0 Mar 20 14:49:11 crc kubenswrapper[4856]: I0320 14:49:11.217177 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" 
event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e"} Mar 20 14:49:11 crc kubenswrapper[4856]: I0320 14:49:11.217442 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerStarted","Data":"daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"} Mar 20 14:49:11 crc kubenswrapper[4856]: I0320 14:49:11.217474 4856 scope.go:117] "RemoveContainer" containerID="042108d8ecfea7fce26918890698eebc0be0f3ebeba1d379c11c6cd92b423eaf" Mar 20 14:49:31 crc kubenswrapper[4856]: I0320 14:49:31.765254 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hc6d5_c947915a-318f-480b-b4cb-96024bb62eb3/kube-rbac-proxy/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.005986 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-frr-files/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.021184 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hc6d5_c947915a-318f-480b-b4cb-96024bb62eb3/controller/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.190867 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-metrics/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.213673 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-frr-files/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.222379 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-reloader/0.log" Mar 20 14:49:32 crc 
kubenswrapper[4856]: I0320 14:49:32.276209 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-reloader/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.441447 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-metrics/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.450232 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-reloader/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.454058 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-frr-files/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.461711 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-metrics/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.648044 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-frr-files/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.691928 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-metrics/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.691928 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/cp-reloader/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.733822 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/controller/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.895929 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/kube-rbac-proxy/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.912744 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/frr-metrics/0.log" Mar 20 14:49:32 crc kubenswrapper[4856]: I0320 14:49:32.953081 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/kube-rbac-proxy-frr/0.log" Mar 20 14:49:33 crc kubenswrapper[4856]: I0320 14:49:33.152370 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/reloader/0.log" Mar 20 14:49:33 crc kubenswrapper[4856]: I0320 14:49:33.200226 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-s4f2d_7d087c80-beb6-4bdd-b00a-248658c0378c/frr-k8s-webhook-server/0.log" Mar 20 14:49:33 crc kubenswrapper[4856]: I0320 14:49:33.442624 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5cbf85c554-nqz8j_c40497b5-5353-4d09-b108-88f673dc8f13/manager/0.log" Mar 20 14:49:33 crc kubenswrapper[4856]: I0320 14:49:33.567672 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7c5d84d5d5-qg7c5_97ab1eaf-cfd7-47b3-afc8-c8327065108c/webhook-server/0.log" Mar 20 14:49:33 crc kubenswrapper[4856]: I0320 14:49:33.651211 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjqm4_c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e/kube-rbac-proxy/0.log" Mar 20 14:49:34 crc kubenswrapper[4856]: I0320 14:49:34.230591 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-gjqm4_c0c6db6f-8ff3-4d0e-9172-5d9a7ed83a8e/speaker/0.log" Mar 20 14:49:34 crc kubenswrapper[4856]: I0320 14:49:34.705461 4856 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-stt59_75f4513d-63d1-4433-b319-038b189e4be5/frr/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.318837 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/util/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.505293 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/util/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.506886 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/pull/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.559465 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/pull/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.694811 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/util/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.718361 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/pull/0.log" Mar 20 14:49:46 crc kubenswrapper[4856]: I0320 14:49:46.720367 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874pvd7v_69075990-e502-4fda-b16b-ec1247bcb9a2/extract/0.log" Mar 20 14:49:46 crc 
kubenswrapper[4856]: I0320 14:49:46.897673 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.075899 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.109857 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/pull/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.139720 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/pull/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.340623 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/pull/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.340962 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/extract/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.357241 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c12hc2v_473deb71-de01-4290-a70d-f21794be8f0e/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.525846 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.658574 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/pull/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.659303 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.680401 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/pull/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.854223 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/extract/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.861347 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/util/0.log" Mar 20 14:49:47 crc kubenswrapper[4856]: I0320 14:49:47.877383 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e54nr7b_0756beee-1cb6-48fe-8910-fb87333a83d8/pull/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.065414 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-utilities/0.log" Mar 20 14:49:48 crc 
kubenswrapper[4856]: I0320 14:49:48.250631 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-content/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.267580 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-content/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.275784 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-utilities/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.524043 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-utilities/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.529534 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/extract-content/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.696473 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-utilities/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.922238 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-content/0.log" Mar 20 14:49:48 crc kubenswrapper[4856]: I0320 14:49:48.945375 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-utilities/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.004394 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-5xl9m_3e12430a-718f-4734-bfa0-1e4fd5d46b38/registry-server/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.004874 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-content/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.208487 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-content/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.208620 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/extract-utilities/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.483405 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n4z5q_e3871fbb-6e58-45b2-a475-a45fa18a090d/marketplace-operator/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.596183 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-utilities/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.774167 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-content/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.810962 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-utilities/0.log" Mar 20 14:49:49 crc kubenswrapper[4856]: I0320 14:49:49.864332 4856 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-content/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.033416 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-content/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.095416 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/extract-utilities/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.132727 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-z6827_d77a2053-0327-4a08-a50a-89945990633c/registry-server/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.327913 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-utilities/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.334013 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-hkp5x_c2207a1a-f126-4cb6-841e-a7904a74a7d9/registry-server/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.461264 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-content/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.495931 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-content/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.527250 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-utilities/0.log" 
Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.705171 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-utilities/0.log" Mar 20 14:49:50 crc kubenswrapper[4856]: I0320 14:49:50.725976 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/extract-content/0.log" Mar 20 14:49:51 crc kubenswrapper[4856]: I0320 14:49:51.443854 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-dn229_028ade1f-56fe-45a4-a7ce-6d3d62e38657/registry-server/0.log" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.151843 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566970-nzkdp"] Mar 20 14:50:00 crc kubenswrapper[4856]: E0320 14:50:00.152963 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="extract-content" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.152982 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="extract-content" Mar 20 14:50:00 crc kubenswrapper[4856]: E0320 14:50:00.152997 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.153009 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4856]: E0320 14:50:00.153023 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.153034 4856 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.153288 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc89b5bd-1ea0-4b6c-93fc-8da2d3f90caa" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.154010 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.158159 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.158283 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.163125 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.164755 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-nzkdp"] Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.287596 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmr6v\" (UniqueName: \"kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v\") pod \"auto-csr-approver-29566970-nzkdp\" (UID: \"0098a499-d0a3-43a3-8046-e6bb63a482e0\") " pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.388966 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmr6v\" (UniqueName: \"kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v\") pod \"auto-csr-approver-29566970-nzkdp\" (UID: \"0098a499-d0a3-43a3-8046-e6bb63a482e0\") " 
pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.407535 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmr6v\" (UniqueName: \"kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v\") pod \"auto-csr-approver-29566970-nzkdp\" (UID: \"0098a499-d0a3-43a3-8046-e6bb63a482e0\") " pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.477492 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:00 crc kubenswrapper[4856]: I0320 14:50:00.913679 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-nzkdp"] Mar 20 14:50:01 crc kubenswrapper[4856]: I0320 14:50:01.569121 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" event={"ID":"0098a499-d0a3-43a3-8046-e6bb63a482e0","Type":"ContainerStarted","Data":"62c46b4af80dcf3c77dbdd989ba1975b78df1f8e65fe128083310c5b8bbc9c9e"} Mar 20 14:50:03 crc kubenswrapper[4856]: I0320 14:50:03.589950 4856 generic.go:334] "Generic (PLEG): container finished" podID="0098a499-d0a3-43a3-8046-e6bb63a482e0" containerID="e68432c07be8a58afad9c49801a1258215e27438d45df8bb9b9f07bf75f4b2dd" exitCode=0 Mar 20 14:50:03 crc kubenswrapper[4856]: I0320 14:50:03.590247 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" event={"ID":"0098a499-d0a3-43a3-8046-e6bb63a482e0","Type":"ContainerDied","Data":"e68432c07be8a58afad9c49801a1258215e27438d45df8bb9b9f07bf75f4b2dd"} Mar 20 14:50:04 crc kubenswrapper[4856]: I0320 14:50:04.922448 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.055903 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmr6v\" (UniqueName: \"kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v\") pod \"0098a499-d0a3-43a3-8046-e6bb63a482e0\" (UID: \"0098a499-d0a3-43a3-8046-e6bb63a482e0\") " Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.060191 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v" (OuterVolumeSpecName: "kube-api-access-nmr6v") pod "0098a499-d0a3-43a3-8046-e6bb63a482e0" (UID: "0098a499-d0a3-43a3-8046-e6bb63a482e0"). InnerVolumeSpecName "kube-api-access-nmr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.157366 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmr6v\" (UniqueName: \"kubernetes.io/projected/0098a499-d0a3-43a3-8046-e6bb63a482e0-kube-api-access-nmr6v\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.601736 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" event={"ID":"0098a499-d0a3-43a3-8046-e6bb63a482e0","Type":"ContainerDied","Data":"62c46b4af80dcf3c77dbdd989ba1975b78df1f8e65fe128083310c5b8bbc9c9e"} Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.601985 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62c46b4af80dcf3c77dbdd989ba1975b78df1f8e65fe128083310c5b8bbc9c9e" Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.602034 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-nzkdp" Mar 20 14:50:05 crc kubenswrapper[4856]: I0320 14:50:05.995692 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-wxcp9"] Mar 20 14:50:06 crc kubenswrapper[4856]: I0320 14:50:06.004934 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-wxcp9"] Mar 20 14:50:07 crc kubenswrapper[4856]: I0320 14:50:07.827399 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66ce2de7-9712-4031-b5c8-edbfe5c775cf" path="/var/lib/kubelet/pods/66ce2de7-9712-4031-b5c8-edbfe5c775cf/volumes" Mar 20 14:50:36 crc kubenswrapper[4856]: I0320 14:50:36.306792 4856 scope.go:117] "RemoveContainer" containerID="6d44629c872b7b54ea08f37404c1e282ada32c159a4d01ae8818a9aa08200882" Mar 20 14:51:18 crc kubenswrapper[4856]: I0320 14:51:18.143792 4856 generic.go:334] "Generic (PLEG): container finished" podID="317a9dab-b70f-45db-afd0-51fc3430145f" containerID="bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14" exitCode=0 Mar 20 14:51:18 crc kubenswrapper[4856]: I0320 14:51:18.143875 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" event={"ID":"317a9dab-b70f-45db-afd0-51fc3430145f","Type":"ContainerDied","Data":"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14"} Mar 20 14:51:18 crc kubenswrapper[4856]: I0320 14:51:18.144776 4856 scope.go:117] "RemoveContainer" containerID="bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14" Mar 20 14:51:18 crc kubenswrapper[4856]: I0320 14:51:18.902856 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjr6f_must-gather-hkhnb_317a9dab-b70f-45db-afd0-51fc3430145f/gather/0.log" Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.475346 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-must-gather-fjr6f/must-gather-hkhnb"] Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.476320 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="copy" containerID="cri-o://d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4" gracePeriod=2 Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.483395 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fjr6f/must-gather-hkhnb"] Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.867652 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjr6f_must-gather-hkhnb_317a9dab-b70f-45db-afd0-51fc3430145f/copy/0.log" Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.868451 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.997524 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output\") pod \"317a9dab-b70f-45db-afd0-51fc3430145f\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " Mar 20 14:51:26 crc kubenswrapper[4856]: I0320 14:51:26.997687 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9s5p\" (UniqueName: \"kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p\") pod \"317a9dab-b70f-45db-afd0-51fc3430145f\" (UID: \"317a9dab-b70f-45db-afd0-51fc3430145f\") " Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.004735 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p" (OuterVolumeSpecName: 
"kube-api-access-m9s5p") pod "317a9dab-b70f-45db-afd0-51fc3430145f" (UID: "317a9dab-b70f-45db-afd0-51fc3430145f"). InnerVolumeSpecName "kube-api-access-m9s5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.099306 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9s5p\" (UniqueName: \"kubernetes.io/projected/317a9dab-b70f-45db-afd0-51fc3430145f-kube-api-access-m9s5p\") on node \"crc\" DevicePath \"\"" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.129042 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "317a9dab-b70f-45db-afd0-51fc3430145f" (UID: "317a9dab-b70f-45db-afd0-51fc3430145f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.202025 4856 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/317a9dab-b70f-45db-afd0-51fc3430145f-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.223107 4856 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fjr6f_must-gather-hkhnb_317a9dab-b70f-45db-afd0-51fc3430145f/copy/0.log" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.223471 4856 generic.go:334] "Generic (PLEG): container finished" podID="317a9dab-b70f-45db-afd0-51fc3430145f" containerID="d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4" exitCode=143 Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.223522 4856 scope.go:117] "RemoveContainer" containerID="d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.223638 4856 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-fjr6f/must-gather-hkhnb" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.251859 4856 scope.go:117] "RemoveContainer" containerID="bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.396500 4856 scope.go:117] "RemoveContainer" containerID="d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4" Mar 20 14:51:27 crc kubenswrapper[4856]: E0320 14:51:27.403411 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4\": container with ID starting with d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4 not found: ID does not exist" containerID="d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.403459 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4"} err="failed to get container status \"d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4\": rpc error: code = NotFound desc = could not find container \"d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4\": container with ID starting with d4b7ef22ba1d99d680d4b13018d8ddb7e64072eaba9d7a2bff8dd5092498d3d4 not found: ID does not exist" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.403482 4856 scope.go:117] "RemoveContainer" containerID="bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14" Mar 20 14:51:27 crc kubenswrapper[4856]: E0320 14:51:27.407377 4856 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14\": container with ID starting with 
bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14 not found: ID does not exist" containerID="bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.407411 4856 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14"} err="failed to get container status \"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14\": rpc error: code = NotFound desc = could not find container \"bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14\": container with ID starting with bd8a30c0b1d50a269035b877824b17800e27b29f1133b3b6d1f210d8c0fa1e14 not found: ID does not exist" Mar 20 14:51:27 crc kubenswrapper[4856]: I0320 14:51:27.831057 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" path="/var/lib/kubelet/pods/317a9dab-b70f-45db-afd0-51fc3430145f/volumes" Mar 20 14:51:39 crc kubenswrapper[4856]: I0320 14:51:39.988323 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:51:39 crc kubenswrapper[4856]: I0320 14:51:39.988907 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.151491 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566972-v5fm5"] Mar 20 14:52:00 crc kubenswrapper[4856]: E0320 14:52:00.152860 
4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0098a499-d0a3-43a3-8046-e6bb63a482e0" containerName="oc" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.152875 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="0098a499-d0a3-43a3-8046-e6bb63a482e0" containerName="oc" Mar 20 14:52:00 crc kubenswrapper[4856]: E0320 14:52:00.152905 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="copy" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.152912 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="copy" Mar 20 14:52:00 crc kubenswrapper[4856]: E0320 14:52:00.152930 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="gather" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.152937 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="gather" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.153117 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="gather" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.153134 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="317a9dab-b70f-45db-afd0-51fc3430145f" containerName="copy" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.153143 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="0098a499-d0a3-43a3-8046-e6bb63a482e0" containerName="oc" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.153893 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.156646 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.156692 4856 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-mktsp" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.157940 4856 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.162364 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-v5fm5"] Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.311010 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq6nq\" (UniqueName: \"kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq\") pod \"auto-csr-approver-29566972-v5fm5\" (UID: \"93e1d03a-4c8c-49c6-a5e8-42711ca713e3\") " pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.412316 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq6nq\" (UniqueName: \"kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq\") pod \"auto-csr-approver-29566972-v5fm5\" (UID: \"93e1d03a-4c8c-49c6-a5e8-42711ca713e3\") " pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.432659 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq6nq\" (UniqueName: \"kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq\") pod \"auto-csr-approver-29566972-v5fm5\" (UID: \"93e1d03a-4c8c-49c6-a5e8-42711ca713e3\") " 
pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.475525 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:00 crc kubenswrapper[4856]: I0320 14:52:00.936246 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-v5fm5"] Mar 20 14:52:01 crc kubenswrapper[4856]: I0320 14:52:01.483934 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" event={"ID":"93e1d03a-4c8c-49c6-a5e8-42711ca713e3","Type":"ContainerStarted","Data":"bcc096f9aee06abc2bd10562370c29cb346395b16e880e6ef8bf42efd34cf73d"} Mar 20 14:52:02 crc kubenswrapper[4856]: I0320 14:52:02.492934 4856 generic.go:334] "Generic (PLEG): container finished" podID="93e1d03a-4c8c-49c6-a5e8-42711ca713e3" containerID="453ede61d01160920f3c930d39754df5f0bf7cb85b9b188c4a9b0549c692b712" exitCode=0 Mar 20 14:52:02 crc kubenswrapper[4856]: I0320 14:52:02.492977 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" event={"ID":"93e1d03a-4c8c-49c6-a5e8-42711ca713e3","Type":"ContainerDied","Data":"453ede61d01160920f3c930d39754df5f0bf7cb85b9b188c4a9b0549c692b712"} Mar 20 14:52:03 crc kubenswrapper[4856]: I0320 14:52:03.777102 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:03 crc kubenswrapper[4856]: I0320 14:52:03.865821 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq6nq\" (UniqueName: \"kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq\") pod \"93e1d03a-4c8c-49c6-a5e8-42711ca713e3\" (UID: \"93e1d03a-4c8c-49c6-a5e8-42711ca713e3\") " Mar 20 14:52:03 crc kubenswrapper[4856]: I0320 14:52:03.877560 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq" (OuterVolumeSpecName: "kube-api-access-tq6nq") pod "93e1d03a-4c8c-49c6-a5e8-42711ca713e3" (UID: "93e1d03a-4c8c-49c6-a5e8-42711ca713e3"). InnerVolumeSpecName "kube-api-access-tq6nq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:52:03 crc kubenswrapper[4856]: I0320 14:52:03.967543 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq6nq\" (UniqueName: \"kubernetes.io/projected/93e1d03a-4c8c-49c6-a5e8-42711ca713e3-kube-api-access-tq6nq\") on node \"crc\" DevicePath \"\"" Mar 20 14:52:04 crc kubenswrapper[4856]: I0320 14:52:04.508256 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" event={"ID":"93e1d03a-4c8c-49c6-a5e8-42711ca713e3","Type":"ContainerDied","Data":"bcc096f9aee06abc2bd10562370c29cb346395b16e880e6ef8bf42efd34cf73d"} Mar 20 14:52:04 crc kubenswrapper[4856]: I0320 14:52:04.508317 4856 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc096f9aee06abc2bd10562370c29cb346395b16e880e6ef8bf42efd34cf73d" Mar 20 14:52:04 crc kubenswrapper[4856]: I0320 14:52:04.508698 4856 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-v5fm5" Mar 20 14:52:04 crc kubenswrapper[4856]: I0320 14:52:04.859151 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-rt5td"] Mar 20 14:52:04 crc kubenswrapper[4856]: I0320 14:52:04.866534 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-rt5td"] Mar 20 14:52:05 crc kubenswrapper[4856]: I0320 14:52:05.827588 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c57cc40d-649a-4dff-8eeb-1483c7a72bf4" path="/var/lib/kubelet/pods/c57cc40d-649a-4dff-8eeb-1483c7a72bf4/volumes" Mar 20 14:52:09 crc kubenswrapper[4856]: I0320 14:52:09.987175 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:52:09 crc kubenswrapper[4856]: I0320 14:52:09.987711 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:52:36 crc kubenswrapper[4856]: I0320 14:52:36.406734 4856 scope.go:117] "RemoveContainer" containerID="ea5651f9695f7595779da1e181e5134bef63fe5eb571fdc510c6d9e42ba33052" Mar 20 14:52:36 crc kubenswrapper[4856]: I0320 14:52:36.458616 4856 scope.go:117] "RemoveContainer" containerID="851a84dfe498dc85cf933494515139ec3053841416f8b8115be21e7c4e64d870" Mar 20 14:52:39 crc kubenswrapper[4856]: I0320 14:52:39.987787 4856 patch_prober.go:28] interesting pod/machine-config-daemon-dhzh4 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:52:39 crc kubenswrapper[4856]: I0320 14:52:39.988336 4856 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:52:39 crc kubenswrapper[4856]: I0320 14:52:39.988381 4856 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" Mar 20 14:52:39 crc kubenswrapper[4856]: I0320 14:52:39.989011 4856 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"} pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:52:39 crc kubenswrapper[4856]: I0320 14:52:39.989073 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerName="machine-config-daemon" containerID="cri-o://daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c" gracePeriod=600 Mar 20 14:52:40 crc kubenswrapper[4856]: E0320 14:52:40.113617 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:52:40 crc kubenswrapper[4856]: I0320 14:52:40.803594 4856 generic.go:334] "Generic (PLEG): container finished" podID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c" exitCode=0 Mar 20 14:52:40 crc kubenswrapper[4856]: I0320 14:52:40.803638 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" event={"ID":"e51a8789-c529-4a2c-b8f1-dc31a3c06403","Type":"ContainerDied","Data":"daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"} Mar 20 14:52:40 crc kubenswrapper[4856]: I0320 14:52:40.803673 4856 scope.go:117] "RemoveContainer" containerID="9aea58c4b4e51c36881fb1ce503417b21ed62913b9282f3435adab651f87e00e" Mar 20 14:52:40 crc kubenswrapper[4856]: I0320 14:52:40.804203 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c" Mar 20 14:52:40 crc kubenswrapper[4856]: E0320 14:52:40.804486 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:52:51 crc kubenswrapper[4856]: I0320 14:52:51.820926 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c" Mar 20 14:52:51 crc kubenswrapper[4856]: E0320 14:52:51.821893 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.882748 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-td9dp"] Mar 20 14:53:01 crc kubenswrapper[4856]: E0320 14:53:01.883645 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93e1d03a-4c8c-49c6-a5e8-42711ca713e3" containerName="oc" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.883660 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="93e1d03a-4c8c-49c6-a5e8-42711ca713e3" containerName="oc" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.883888 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="93e1d03a-4c8c-49c6-a5e8-42711ca713e3" containerName="oc" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.885356 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-td9dp"] Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.885580 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.951944 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.952204 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:01 crc kubenswrapper[4856]: I0320 14:53:01.952371 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9wh\" (UniqueName: \"kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.053790 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9wh\" (UniqueName: \"kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.053915 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content\") pod 
\"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.054002 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.054538 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.054571 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.075190 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9wh\" (UniqueName: \"kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh\") pod \"certified-operators-td9dp\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") " pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.251935 4856 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.727303 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-td9dp"] Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.819781 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c" Mar 20 14:53:02 crc kubenswrapper[4856]: E0320 14:53:02.820156 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403" Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.993724 4856 generic.go:334] "Generic (PLEG): container finished" podID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerID="76c045b83d75c59715db7a314c8569e4c04152e41bb1c289678b00767d8d49ec" exitCode=0 Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.993822 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerDied","Data":"76c045b83d75c59715db7a314c8569e4c04152e41bb1c289678b00767d8d49ec"} Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.994054 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerStarted","Data":"e573fffdc8e38e444c00359ee41ff220e5fbc5b12d4fea817cfb0f4bf363d88e"} Mar 20 14:53:02 crc kubenswrapper[4856]: I0320 14:53:02.995608 4856 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 
14:53:05 crc kubenswrapper[4856]: I0320 14:53:05.013299 4856 generic.go:334] "Generic (PLEG): container finished" podID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerID="63f641fdd5d8c45cbe1ec98a011623ed4810ef3fbd85aca7f1916564e6f718b3" exitCode=0 Mar 20 14:53:05 crc kubenswrapper[4856]: I0320 14:53:05.013348 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerDied","Data":"63f641fdd5d8c45cbe1ec98a011623ed4810ef3fbd85aca7f1916564e6f718b3"} Mar 20 14:53:06 crc kubenswrapper[4856]: I0320 14:53:06.022848 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerStarted","Data":"6d416c1c66c6b9de7b4d6ed3a593214f0bff0b7876c67fd564ca703b2f7ee37d"} Mar 20 14:53:06 crc kubenswrapper[4856]: I0320 14:53:06.047356 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-td9dp" podStartSLOduration=2.6567325630000003 podStartE2EDuration="5.047340197s" podCreationTimestamp="2026-03-20 14:53:01 +0000 UTC" firstStartedPulling="2026-03-20 14:53:02.995376289 +0000 UTC m=+5397.876402419" lastFinishedPulling="2026-03-20 14:53:05.385983923 +0000 UTC m=+5400.267010053" observedRunningTime="2026-03-20 14:53:06.042107883 +0000 UTC m=+5400.923134033" watchObservedRunningTime="2026-03-20 14:53:06.047340197 +0000 UTC m=+5400.928366327" Mar 20 14:53:12 crc kubenswrapper[4856]: I0320 14:53:12.252322 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:12 crc kubenswrapper[4856]: I0320 14:53:12.253059 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-td9dp" Mar 20 14:53:12 crc kubenswrapper[4856]: I0320 14:53:12.332649 4856 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-td9dp"
Mar 20 14:53:13 crc kubenswrapper[4856]: I0320 14:53:13.124093 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-td9dp"
Mar 20 14:53:13 crc kubenswrapper[4856]: I0320 14:53:13.166136 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-td9dp"]
Mar 20 14:53:13 crc kubenswrapper[4856]: I0320 14:53:13.819514 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"
Mar 20 14:53:13 crc kubenswrapper[4856]: E0320 14:53:13.819864 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:53:15 crc kubenswrapper[4856]: I0320 14:53:15.101415 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-td9dp" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="registry-server" containerID="cri-o://6d416c1c66c6b9de7b4d6ed3a593214f0bff0b7876c67fd564ca703b2f7ee37d" gracePeriod=2
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.114829 4856 generic.go:334] "Generic (PLEG): container finished" podID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerID="6d416c1c66c6b9de7b4d6ed3a593214f0bff0b7876c67fd564ca703b2f7ee37d" exitCode=0
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.114898 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerDied","Data":"6d416c1c66c6b9de7b4d6ed3a593214f0bff0b7876c67fd564ca703b2f7ee37d"}
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.638583 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-td9dp"
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.685680 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities\") pod \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") "
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.685839 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9wh\" (UniqueName: \"kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh\") pod \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") "
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.685889 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content\") pod \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\" (UID: \"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc\") "
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.688124 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities" (OuterVolumeSpecName: "utilities") pod "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" (UID: "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.692187 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh" (OuterVolumeSpecName: "kube-api-access-nc9wh") pod "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" (UID: "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc"). InnerVolumeSpecName "kube-api-access-nc9wh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.736029 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" (UID: "d9a7f333-3a9e-438b-a4e3-d17fd1a701cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.787228 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9wh\" (UniqueName: \"kubernetes.io/projected/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-kube-api-access-nc9wh\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.787260 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:16 crc kubenswrapper[4856]: I0320 14:53:16.787284 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.127500 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-td9dp" event={"ID":"d9a7f333-3a9e-438b-a4e3-d17fd1a701cc","Type":"ContainerDied","Data":"e573fffdc8e38e444c00359ee41ff220e5fbc5b12d4fea817cfb0f4bf363d88e"}
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.127559 4856 scope.go:117] "RemoveContainer" containerID="6d416c1c66c6b9de7b4d6ed3a593214f0bff0b7876c67fd564ca703b2f7ee37d"
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.127572 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-td9dp"
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.149136 4856 scope.go:117] "RemoveContainer" containerID="63f641fdd5d8c45cbe1ec98a011623ed4810ef3fbd85aca7f1916564e6f718b3"
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.166403 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-td9dp"]
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.171641 4856 scope.go:117] "RemoveContainer" containerID="76c045b83d75c59715db7a314c8569e4c04152e41bb1c289678b00767d8d49ec"
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.175523 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-td9dp"]
Mar 20 14:53:17 crc kubenswrapper[4856]: I0320 14:53:17.832514 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" path="/var/lib/kubelet/pods/d9a7f333-3a9e-438b-a4e3-d17fd1a701cc/volumes"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.923090 4856 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:20 crc kubenswrapper[4856]: E0320 14:53:20.923903 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="registry-server"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.923915 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="registry-server"
Mar 20 14:53:20 crc kubenswrapper[4856]: E0320 14:53:20.923926 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="extract-content"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.923932 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="extract-content"
Mar 20 14:53:20 crc kubenswrapper[4856]: E0320 14:53:20.923956 4856 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="extract-utilities"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.923963 4856 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="extract-utilities"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.924114 4856 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a7f333-3a9e-438b-a4e3-d17fd1a701cc" containerName="registry-server"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.928187 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.935119 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.949948 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv9tl\" (UniqueName: \"kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.952157 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:20 crc kubenswrapper[4856]: I0320 14:53:20.952222 4856 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.053738 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.054170 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.054240 4856 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv9tl\" (UniqueName: \"kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.054632 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.054763 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.074139 4856 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv9tl\" (UniqueName: \"kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl\") pod \"redhat-operators-vwh6j\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") " pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.254667 4856 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:21 crc kubenswrapper[4856]: I0320 14:53:21.762944 4856 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:21 crc kubenswrapper[4856]: W0320 14:53:21.772559 4856 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58fac0c0_6484_4553_8c22_461c96f25278.slice/crio-5949913c24b6e37716927e1f9046044da2d50d0a319371a5d3ec87e24becd7e8 WatchSource:0}: Error finding container 5949913c24b6e37716927e1f9046044da2d50d0a319371a5d3ec87e24becd7e8: Status 404 returned error can't find the container with id 5949913c24b6e37716927e1f9046044da2d50d0a319371a5d3ec87e24becd7e8
Mar 20 14:53:22 crc kubenswrapper[4856]: I0320 14:53:22.164563 4856 generic.go:334] "Generic (PLEG): container finished" podID="58fac0c0-6484-4553-8c22-461c96f25278" containerID="a96ee67b0e1cb78cb80828d3d02905a20437f493d911707e7bd40750d98a94ed" exitCode=0
Mar 20 14:53:22 crc kubenswrapper[4856]: I0320 14:53:22.164617 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerDied","Data":"a96ee67b0e1cb78cb80828d3d02905a20437f493d911707e7bd40750d98a94ed"}
Mar 20 14:53:22 crc kubenswrapper[4856]: I0320 14:53:22.164651 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerStarted","Data":"5949913c24b6e37716927e1f9046044da2d50d0a319371a5d3ec87e24becd7e8"}
Mar 20 14:53:23 crc kubenswrapper[4856]: I0320 14:53:23.173823 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerStarted","Data":"1e41336e93744ffe856e6d79a6d5f9e6fd36ef4955605ac32b799f64d52c7b87"}
Mar 20 14:53:24 crc kubenswrapper[4856]: I0320 14:53:24.183153 4856 generic.go:334] "Generic (PLEG): container finished" podID="58fac0c0-6484-4553-8c22-461c96f25278" containerID="1e41336e93744ffe856e6d79a6d5f9e6fd36ef4955605ac32b799f64d52c7b87" exitCode=0
Mar 20 14:53:24 crc kubenswrapper[4856]: I0320 14:53:24.183204 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerDied","Data":"1e41336e93744ffe856e6d79a6d5f9e6fd36ef4955605ac32b799f64d52c7b87"}
Mar 20 14:53:25 crc kubenswrapper[4856]: I0320 14:53:25.196068 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerStarted","Data":"16c520bec8c95640dc920d7ea6c4adca23c2c9fbe1e9b9a28ecf91a29cf568d6"}
Mar 20 14:53:25 crc kubenswrapper[4856]: I0320 14:53:25.233010 4856 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vwh6j" podStartSLOduration=2.788898518 podStartE2EDuration="5.232990207s" podCreationTimestamp="2026-03-20 14:53:20 +0000 UTC" firstStartedPulling="2026-03-20 14:53:22.167972573 +0000 UTC m=+5417.048998703" lastFinishedPulling="2026-03-20 14:53:24.612064262 +0000 UTC m=+5419.493090392" observedRunningTime="2026-03-20 14:53:25.227837636 +0000 UTC m=+5420.108863786" watchObservedRunningTime="2026-03-20 14:53:25.232990207 +0000 UTC m=+5420.114016337"
Mar 20 14:53:27 crc kubenswrapper[4856]: I0320 14:53:27.819672 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"
Mar 20 14:53:27 crc kubenswrapper[4856]: E0320 14:53:27.820552 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:53:31 crc kubenswrapper[4856]: I0320 14:53:31.255586 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:31 crc kubenswrapper[4856]: I0320 14:53:31.255915 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:31 crc kubenswrapper[4856]: I0320 14:53:31.303737 4856 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:32 crc kubenswrapper[4856]: I0320 14:53:32.333535 4856 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:32 crc kubenswrapper[4856]: I0320 14:53:32.537038 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:34 crc kubenswrapper[4856]: I0320 14:53:34.292533 4856 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vwh6j" podUID="58fac0c0-6484-4553-8c22-461c96f25278" containerName="registry-server" containerID="cri-o://16c520bec8c95640dc920d7ea6c4adca23c2c9fbe1e9b9a28ecf91a29cf568d6" gracePeriod=2
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.302471 4856 generic.go:334] "Generic (PLEG): container finished" podID="58fac0c0-6484-4553-8c22-461c96f25278" containerID="16c520bec8c95640dc920d7ea6c4adca23c2c9fbe1e9b9a28ecf91a29cf568d6" exitCode=0
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.302593 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerDied","Data":"16c520bec8c95640dc920d7ea6c4adca23c2c9fbe1e9b9a28ecf91a29cf568d6"}
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.643147 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.779695 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities\") pod \"58fac0c0-6484-4553-8c22-461c96f25278\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") "
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.779787 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content\") pod \"58fac0c0-6484-4553-8c22-461c96f25278\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") "
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.779937 4856 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv9tl\" (UniqueName: \"kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl\") pod \"58fac0c0-6484-4553-8c22-461c96f25278\" (UID: \"58fac0c0-6484-4553-8c22-461c96f25278\") "
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.781712 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities" (OuterVolumeSpecName: "utilities") pod "58fac0c0-6484-4553-8c22-461c96f25278" (UID: "58fac0c0-6484-4553-8c22-461c96f25278"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.788401 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl" (OuterVolumeSpecName: "kube-api-access-mv9tl") pod "58fac0c0-6484-4553-8c22-461c96f25278" (UID: "58fac0c0-6484-4553-8c22-461c96f25278"). InnerVolumeSpecName "kube-api-access-mv9tl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.882155 4856 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.882190 4856 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv9tl\" (UniqueName: \"kubernetes.io/projected/58fac0c0-6484-4553-8c22-461c96f25278-kube-api-access-mv9tl\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.954184 4856 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58fac0c0-6484-4553-8c22-461c96f25278" (UID: "58fac0c0-6484-4553-8c22-461c96f25278"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:53:35 crc kubenswrapper[4856]: I0320 14:53:35.983681 4856 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58fac0c0-6484-4553-8c22-461c96f25278-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.317027 4856 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vwh6j" event={"ID":"58fac0c0-6484-4553-8c22-461c96f25278","Type":"ContainerDied","Data":"5949913c24b6e37716927e1f9046044da2d50d0a319371a5d3ec87e24becd7e8"}
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.317085 4856 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vwh6j"
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.317100 4856 scope.go:117] "RemoveContainer" containerID="16c520bec8c95640dc920d7ea6c4adca23c2c9fbe1e9b9a28ecf91a29cf568d6"
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.350211 4856 scope.go:117] "RemoveContainer" containerID="1e41336e93744ffe856e6d79a6d5f9e6fd36ef4955605ac32b799f64d52c7b87"
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.356449 4856 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.362302 4856 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vwh6j"]
Mar 20 14:53:36 crc kubenswrapper[4856]: I0320 14:53:36.378814 4856 scope.go:117] "RemoveContainer" containerID="a96ee67b0e1cb78cb80828d3d02905a20437f493d911707e7bd40750d98a94ed"
Mar 20 14:53:37 crc kubenswrapper[4856]: I0320 14:53:37.829758 4856 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fac0c0-6484-4553-8c22-461c96f25278" path="/var/lib/kubelet/pods/58fac0c0-6484-4553-8c22-461c96f25278/volumes"
Mar 20 14:53:41 crc kubenswrapper[4856]: I0320 14:53:41.819906 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"
Mar 20 14:53:41 crc kubenswrapper[4856]: E0320 14:53:41.821060 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"
Mar 20 14:53:55 crc kubenswrapper[4856]: I0320 14:53:55.824713 4856 scope.go:117] "RemoveContainer" containerID="daf6918ee71e0258af12955355789cd76eb580ff513022b6f9abc1a1ffa4785c"
Mar 20 14:53:55 crc kubenswrapper[4856]: E0320 14:53:55.825533 4856 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dhzh4_openshift-machine-config-operator(e51a8789-c529-4a2c-b8f1-dc31a3c06403)\"" pod="openshift-machine-config-operator/machine-config-daemon-dhzh4" podUID="e51a8789-c529-4a2c-b8f1-dc31a3c06403"